
Probst, Philipp; Wright, Marvin N. and Boulesteix, Anne-Laure (2019): Hyperparameters and tuning strategies for random forest. In: Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, Vol. 9, No. 3, e1301

Full text is not available on 'Open Access LMU'.

Abstract

The random forest (RF) algorithm has several hyperparameters that have to be set by the user, for example, the number of observations drawn randomly for each tree and whether they are drawn with or without replacement, the number of variables drawn randomly for each split, the splitting rule, the minimum number of samples that a node must contain, and the number of trees. In this paper, we first provide a literature review on the parameters' influence on the prediction performance and on variable importance measures. It is well known that in most cases RF works reasonably well with the default values of the hyperparameters specified in software packages. Nevertheless, tuning the hyperparameters can improve the performance of RF. In the second part of this paper, after presenting a brief overview of tuning strategies, we demonstrate the application of one of the most established tuning strategies, model-based optimization (MBO). To make it easier to use, we provide the tuneRanger R package that tunes RF with MBO automatically. In a benchmark study on several datasets, we compare the prediction performance and runtime of tuneRanger with other tuning implementations in R and RF with default hyperparameters. This article is categorized under: Algorithmic Development > Biological Data Mining; Algorithmic Development > Statistics; Algorithmic Development > Hierarchies and Trees; Technologies > Machine Learning.
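
To make the hyperparameters listed in the abstract concrete, the following minimal sketch sets them explicitly in the ranger R package (on which tuneRanger builds). The values and the iris example data are chosen purely for illustration and are not the defaults or tuned values discussed in the paper.

library(ranger)

data(iris)

# Fit a random forest with the hyperparameters from the abstract set explicitly
# (illustrative values only):
rf <- ranger(
  Species ~ .,
  data            = iris,
  num.trees       = 500,    # number of trees
  mtry            = 2,      # variables drawn randomly for each split
  sample.fraction = 0.8,    # fraction of observations drawn for each tree
  replace         = FALSE,  # draw observations without replacement
  min.node.size   = 5,      # minimum number of samples in a terminal node
  splitrule       = "gini"  # splitting rule for classification
)

rf$prediction.error  # out-of-bag estimate of the prediction error

In practice, the paper proposes tuning these values automatically with model-based optimization via tuneRanger rather than fixing them by hand as above.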
