Abstract
The performance of many machine learning algorithms depends heavily on the setting of their respective hyperparameters. Many tuning approaches exist, ranging from simple grid and random search to evolutionary algorithms and Bayesian optimization. These algorithms are often used to optimize a single performance criterion. In practical applications, however, a single criterion may not suffice to adequately characterize the behavior of the machine learning method under consideration, and the Pareto front of multiple criteria must be considered instead. We propose to use model-based multi-objective optimization to efficiently approximate such Pareto fronts. Furthermore, the parameter space of many machine learning algorithms consists not only of numeric but also of categorical, integer, or even hierarchical parameters, a general characteristic of many algorithm configuration tasks. Such mixed and hierarchical parameter spaces arise naturally when model selection is performed simultaneously over a whole class of candidate prediction algorithms. Instead of tuning each algorithm individually, our approach configures machine learning pipelines with such a hierarchical structure and operates efficiently on the joint space of all considered algorithms in a multi-objective setting. Our optimization method is readily available as part of the mlrMBO R package on GitHub. We compare its performance against the TunePareto R package and plain Latin hypercube sampling, using a purely numerical setting of SVM hyperparameter tuning and a mixed, hierarchical setting in which we optimize over multiple model spaces at once.
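To illustrate the setup described above, the following is a minimal R sketch of how a mixed, hierarchical parameter space can be tuned multi-objectively with mlrMBO and ParamHelpers. The `requires` clauses encode the hierarchy (learner-specific parameters are only active when that learner is selected), and ParEGO is used as the multi-objective method. The helper `evalPipeline` is hypothetical: it stands for any user-supplied function that resamples the selected learner and returns the two objective values.

```r
library(mlrMBO)
library(ParamHelpers)
library(smoof)

# Mixed, hierarchical space: first choose a learner, then its
# learner-specific hyperparameters; `requires` encodes the hierarchy.
par.set = makeParamSet(
  makeDiscreteParam("learner", values = c("svm", "rf")),
  makeNumericParam("cost", lower = -10, upper = 10,
    trafo = function(x) 2^x, requires = quote(learner == "svm")),
  makeNumericParam("gamma", lower = -10, upper = 10,
    trafo = function(x) 2^x, requires = quote(learner == "svm")),
  makeIntegerParam("mtry", lower = 1L, upper = 20L,
    requires = quote(learner == "rf"))
)

# Bi-objective target, e.g. misclassification error and prediction time.
# `evalPipeline` is a hypothetical helper returning c(error, time).
obj.fun = makeMultiObjectiveFunction(
  name = "pipeline tuning",
  fn = function(x) evalPipeline(x),
  par.set = par.set,
  n.objectives = 2L,
  has.simple.signature = FALSE,
  minimize = c(TRUE, TRUE)
)

ctrl = makeMBOControl(n.objectives = 2L)
ctrl = setMBOControlMultiObj(ctrl, method = "parego")
ctrl = setMBOControlTermination(ctrl, iters = 50L)

res = mbo(obj.fun, control = ctrl)
print(res$pareto.front)  # approximated Pareto front of the two criteria
```

With a mixed space like this, mlrMBO falls back to a random forest surrogate by default, since Gaussian processes do not handle categorical and dependent parameters directly.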
| Item Type: | Conference or Workshop Item (Report) |
| --- | --- |
| Faculties: | Mathematics, Computer Science and Statistics > Computer Science |
| Subjects: | 000 Computer science, information and general works > 004 Data processing computer science |
| Language: | English |
| Item ID: | 47389 |
| Date Deposited: | 27 Apr 2018, 08:12 |
| Last Modified: | 13 Aug 2024, 12:54 |