Haas, Stefan (ORCID: https://orcid.org/0000-0001-9916-0060) and Hüllermeier, Eyke (ORCID: https://orcid.org/0000-0002-9944-4108) (2025): Uncertainty quantification in ordinal classification: A comparison of measures. In: International Journal of Approximate Reasoning, Vol. 186, 109479 [PDF, 4MB]

Creative Commons: Attribution 4.0 (CC-BY)
Published Version

Abstract

Uncertainty quantification has received increasing attention in machine learning in the recent past, but the focus has mostly been on standard (nominal) classification and regression so far. In this paper, we address the question of how to quantify uncertainty in ordinal classification, where class labels have a natural (linear) order. We reckon that commonly used uncertainty measures such as Shannon entropy, confidence, or margin are not appropriate for the ordinal case. In our search for better measures, we draw inspiration from the social sciences literature, which offers various measures to assess so-called consensus or agreement in ordinal data. We argue that these measures, or, more specifically, the dual measures of dispersion or polarization, do have properties that qualify them as measures of uncertainty. Furthermore, inspired by binary decomposition techniques for multi-class classification in machine learning, we propose a new method that allows for turning any uncertainty measure into an ordinal uncertainty measure in a generic way. We evaluate all measures in an empirical study on twenty-three ordinal benchmark datasets, as well as in a real-world case study on automotive goodwill claim assessment. Our studies confirm that dispersion measures and our binary decomposition method surpass conventional (nominal) uncertainty measures.
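To illustrate the distinction the abstract draws, the following Python sketch (our own illustration, not the authors' code; the function names, the cumulative-distribution dispersion measure, and the particular aggregation used in binary_decomposition_entropy are assumptions) contrasts a nominal uncertainty measure, Shannon entropy, with two order-aware alternatives on predicted class probabilities over linearly ordered labels.

# Illustrative sketch only: compares a nominal uncertainty measure (Shannon
# entropy) with a simple ordinal dispersion measure and a binary-decomposition
# style measure on predicted class probabilities over ordered labels.
import numpy as np

def shannon_entropy(p):
    """Nominal uncertainty: ignores the order of the classes."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

def cdf_dispersion(p):
    """Ordinal dispersion based on the cumulative distribution F:
    sums F_k * (1 - F_k) over the K-1 thresholds between adjacent classes.
    It is 0 when all mass sits on one class and maximal when the mass is
    split between the two extreme classes."""
    F = np.cumsum(p)[:-1]                 # thresholds 1..K-1
    return float(np.sum(F * (1.0 - F)))

def binary_decomposition_entropy(p):
    """Binary-decomposition idea (one plausible instantiation): split the
    ordered classes at every threshold into "<= k" vs "> k", apply a binary
    uncertainty measure (here entropy) to each split, and average."""
    F = np.cumsum(p)[:-1]
    return float(np.mean([shannon_entropy([f, 1.0 - f]) for f in F]))

# Two predictions over 5 ordered classes: mass on adjacent classes
# vs. mass on the two extreme classes.
adjacent = [0.0, 0.5, 0.5, 0.0, 0.0]
extremes = [0.5, 0.0, 0.0, 0.0, 0.5]
for name, p in [("adjacent", adjacent), ("extremes", extremes)]:
    print(name,
          "entropy=%.3f" % shannon_entropy(p),
          "cdf_dispersion=%.3f" % cdf_dispersion(p),
          "bin_decomp=%.3f" % binary_decomposition_entropy(p))

Both distributions have the same Shannon entropy (1 bit), yet the order-aware measures assign far higher uncertainty to the prediction split across the extreme classes, which is the intuition behind replacing nominal measures with dispersion or decomposition-based measures in the ordinal setting.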
