Abstract
Ridge regression is a well-established method for shrinking regression parameters towards zero, thereby securing the existence of estimates. The present paper investigates several approaches to combining ridge regression with boosting techniques. In the direct approach the ridge estimator is used to iteratively fit the current residuals, yielding an alternative to the usual ridge estimator. In partial boosting only part of the regression parameters are re-estimated within one step of the iterative procedure. The technique makes it possible to distinguish between variables that are always included in the analysis and variables that are chosen only if relevant. The resulting procedure selects variables in a way similar to the Lasso, yielding a reduced set of influential variables. The suggested procedures are investigated within the classical framework of continuous response variables as well as in the case of generalized linear models. In a simulation study, boosting procedures with different stopping criteria are investigated, and their performance in terms of prediction and the identification of relevant variables is compared to several competitors such as the Lasso and the more recently proposed elastic net.
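The following Python sketch illustrates the two ideas described above for the continuous-response case: direct ridge boosting, where the ridge estimator is repeatedly applied to the current residuals, and a partial (componentwise) variant, where only one optional variable is refitted per step while a set of mandatory variables is always updated. The function names, the fixed penalty `lam`, the fixed number of steps, and the residual-sum-of-squares selection rule are illustrative assumptions, not the paper's exact algorithm or stopping criteria.

```python
import numpy as np

def ridge_boost(X, y, lam=10.0, n_steps=50):
    """Direct ridge boosting (sketch): repeatedly fit the current residuals
    with a ridge estimator and accumulate the coefficients.
    lam and n_steps are illustrative choices, not the paper's settings."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_steps):
        resid = y - X @ beta                                # current residuals
        # ridge fit of the residuals (a weak learner via a large penalty)
        b = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ resid)
        beta += b                                           # update accumulated fit
    return beta

def partial_ridge_boost(X, y, lam=10.0, n_steps=50, mandatory=()):
    """Partial ridge boosting (sketch): in each step the mandatory variables
    are always refitted, and a single optional variable is chosen by the
    largest drop in the residual sum of squares.
    This is one illustrative reading of partial boosting, not the exact method."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_steps):
        resid = y - X @ beta
        if mandatory:
            # variables that are always included: joint ridge update each step
            idx = list(mandatory)
            Xm = X[:, idx]
            bm = np.linalg.solve(Xm.T @ Xm + lam * np.eye(len(idx)), Xm.T @ resid)
            beta[idx] += bm
            resid = y - X @ beta
        # choose the single optional variable that best reduces the RSS
        best_j, best_b, best_rss = None, 0.0, np.inf
        for j in range(p):
            if j in mandatory:
                continue
            xj = X[:, j]
            bj = (xj @ resid) / (xj @ xj + lam)             # one-dimensional ridge fit
            rss = np.sum((resid - bj * xj) ** 2)
            if rss < best_rss:
                best_j, best_b, best_rss = j, bj, rss
        if best_j is not None:
            beta[best_j] += best_b                          # Lasso-like sparse updates
    return beta
```

Because the partial variant touches only one optional coefficient per step, variables that are never selected keep a coefficient of exactly zero, which is what produces the Lasso-like reduced set of influential variables; the number of boosting steps then acts as the regularization parameter that a stopping criterion has to choose.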
| Item Type: | Paper |
|---|---|
| Keywords: | Ridge regression, boosting, Lasso, Pseudo ROC curves |
| Faculties: | Mathematics, Computer Science and Statistics > Statistics > Collaborative Research Center 386; Special Research Fields > Special Research Field 386 |
| Subjects: | 500 Science > 510 Mathematics |
| URN: | urn:nbn:de:bvb:19-epub-1787-5 |
| Language: | English |
| Item ID: | 1787 |
| Date Deposited: | 11. Apr 2007 |
| Last Modified: | 04. Nov 2020, 12:45 |