Abstract
Classification trees based on imprecise probabilities provide an advancement over classical classification trees. While the Gini index is the default splitting criterion in classical classification trees, an extension of the Shannon entropy has been introduced as the splitting criterion in classification trees based on imprecise probabilities. However, the use of such empirical entropy measures as split selection criteria can lead to a bias in variable selection, whereby variables are favored for reasons other than their information content. This bias is not eliminated by the imprecise probability approach. The source of the variable selection bias of the estimated Shannon entropy, as well as possible corrections, are outlined. The variable selection performance of the biased and corrected estimators is evaluated in a simulation study. Additional results from research on variable selection bias in classical classification trees are incorporated, suggesting further investigation of alternative split selection criteria in classification trees based on imprecise probabilities.
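The bias described above stems from the fact that the empirical (plug-in) Shannon entropy systematically underestimates the true entropy in small samples, and the underestimation grows with the number of categories. The following sketch illustrates this effect with one standard bias correction, the Miller-Madow adjustment; this particular correction is chosen here for illustration and is not necessarily the one examined in the paper.

```python
import math
import random

def plugin_entropy(counts):
    """Empirical (plug-in) Shannon entropy estimate from category counts."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def miller_madow_entropy(counts):
    """Miller-Madow corrected estimate: plug-in entropy plus (m - 1) / (2n),
    where m is the number of observed (non-empty) categories."""
    n = sum(counts)
    m = sum(1 for c in counts if c > 0)
    return plugin_entropy(counts) + (m - 1) / (2 * n)

def simulate_bias(k=8, n=30, reps=20000, seed=1):
    """Average both estimators over repeated samples of size n drawn from a
    uniform distribution on k categories (true entropy = log k)."""
    rng = random.Random(seed)
    true_h = math.log(k)
    plug = mm = 0.0
    for _ in range(reps):
        counts = [0] * k
        for _ in range(n):
            counts[rng.randrange(k)] += 1
        plug += plugin_entropy(counts)
        mm += miller_madow_entropy(counts)
    return true_h, plug / reps, mm / reps

if __name__ == "__main__":
    true_h, plug, mm = simulate_bias()
    print(f"true entropy     : {true_h:.4f}")
    print(f"plug-in estimate : {plug:.4f}")  # systematically below the true value
    print(f"Miller-Madow     : {mm:.4f}")    # substantially closer to the true value
```

Because the downward bias of the plug-in estimator is roughly (k - 1) / (2n), variables with many categories appear to reduce entropy more than they actually do, which is one mechanism by which split selection can favor such variables.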
Document type: | Paper |
---|---|
Faculty: | Mathematics, Computer Science and Statistics > Statistics > Sonderforschungsbereich 386; Collaborative Research Centers > Sonderforschungsbereich 386 |
Subject areas: | 500 Natural Sciences and Mathematics > 510 Mathematics |
URN: | urn:nbn:de:bvb:19-epub-1788-0 |
Language: | English |
Document ID: | 1788 |
Date of publication on Open Access LMU: | 11 Apr 2007 |
Last modified: | 04 Nov 2020, 12:45 |