Dembczyński, Krzysztof; Kotłowski, Wojciech and Hüllermeier, Eyke (ORCID: https://orcid.org/0000-0002-9944-4108) (June 2012): Consistent multilabel ranking through univariate loss minimization. ICML'12: 29th International Conference on Machine Learning, Edinburgh, Scotland, UK, 26 June - 1 July 2012. In: Langford, John and Pineau, Joelle (eds.): Proceedings of the Twenty-Ninth International Conference on Machine Learning, pp. 1347-1354.

Abstract

We consider the problem of rank loss minimization in the setting of multilabel classification, which is usually tackled by means of convex surrogate losses defined on pairs of labels. Very recently, this approach was put into question by a negative result showing that commonly used pairwise surrogate losses, such as exponential and logistic losses, are inconsistent. In this paper, we show a positive result which is arguably surprising in light of the previous one: the simpler univariate variants of exponential and logistic surrogates (i.e., defined on single labels) are consistent for rank loss minimization. Instead of directly proving convergence, we give a much stronger result by deriving regret bounds and convergence rates. The proposed losses suggest efficient and scalable algorithms, which are tested experimentally.
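
To make the abstract's central idea concrete, the following is a minimal sketch (not the authors' code) of the univariate approach it describes: instead of pairwise surrogates, a logistic loss is minimized independently for each label, and labels are then ranked by the resulting scores. The function names, training procedure, and synthetic data below are illustrative assumptions, not material from the paper.

    # Illustrative sketch only: per-label (univariate) logistic surrogate
    # minimization, followed by ranking labels by score. All names and data
    # here are assumptions made for illustration.
    import numpy as np

    def fit_univariate_logistic(X, Y, lr=0.1, epochs=500):
        """Fit one linear scorer per label by minimizing the average logistic
        loss log(1 + exp(-y * f(x))), with labels y in {-1, +1}."""
        n, d = X.shape
        _, m = Y.shape
        W = np.zeros((d, m))              # one weight vector per label
        for _ in range(epochs):
            scores = X @ W                # shape (n, m)
            # gradient of the logistic loss with respect to the scores
            grad_scores = -Y / (1.0 + np.exp(Y * scores)) / n
            W -= lr * (X.T @ grad_scores)
        return W

    def rank_labels(W, x):
        """Rank labels for instance x by decreasing score f_k(x)."""
        return np.argsort(-(x @ W))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 5))
        true_W = rng.normal(size=(5, 3))
        Y = np.where(X @ true_W + 0.1 * rng.normal(size=(200, 3)) > 0, 1, -1)
        W = fit_univariate_logistic(X, Y)
        print("label ranking for first instance:", rank_labels(W, X[0]))

Because each surrogate is defined on a single label rather than on label pairs, training cost grows linearly in the number of labels, which is what makes the approach scalable.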
