ORCID: https://orcid.org/0000-0002-8627-1260; Bukas, Christina; Meissen, Felix; Peng, Tingying; Ertürk, Ali
ORCID: https://orcid.org/0000-0001-5163-5100; Rueckert, Daniel; Heckemann, Rolf; Kirschke, Jan; Zimmer, Claus; Wiestler, Benedikt; Menze, Bjoern and Piraud, Marie
(2023):
Approaching Peak Ground Truth.
20th IEEE International Symposium on Biomedical Imaging (ISBI), Cartagena, Colombia, 18-21 April 2023.
Institute of Electrical and Electronics Engineers (Ed.),
In: 2023 IEEE 20th International Symposium on Biomedical Imaging (ISBI),
Piscataway: IEEE.
Abstract
Machine learning models are typically evaluated by computing similarity with reference annotations and trained by maximizing that similarity. Especially in the biomedical domain, annotations are subjective and suffer from low inter- and intra-rater reliability. Since annotations only reflect one interpretation of the real world, this can lead to sub-optimal predictions even though the model achieves high similarity scores. Here, the theoretical concept of Peak Ground Truth (PGT) is introduced. PGT marks the point beyond which an increase in similarity with the reference annotation stops translating to better Real World Model Performance (RWMP). Additionally, a quantitative technique to approximate PGT by computing inter- and intra-rater reliability is proposed. Finally, four categories of PGT-aware strategies to evaluate and improve model performance are reviewed.
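The abstract does not spell out how inter- and intra-rater reliability is computed; purely as an illustration, the sketch below estimates an agreement ceiling from several annotations of the same image using mean pairwise Dice overlap, a common similarity measure in biomedical segmentation. The function names, the choice of Dice, and the synthetic masks are assumptions for this sketch, not taken from the paper.

```python
import itertools
import numpy as np

def dice(a: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> float:
    """Dice similarity coefficient between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    return (2.0 * np.logical_and(a, b).sum() + eps) / (a.sum() + b.sum() + eps)

def inter_rater_agreement(masks: list) -> float:
    """Mean pairwise Dice over all annotator pairs for the same image.

    Illustrative proxy for the agreement ceiling (PGT); the paper's exact
    procedure may differ.
    """
    pairs = itertools.combinations(range(len(masks)), 2)
    return float(np.mean([dice(masks[i], masks[j]) for i, j in pairs]))

# Example with three simulated annotations of the same structure:
# each "rater" flips roughly 5% of the reference pixels.
rng = np.random.default_rng(0)
reference = rng.random((64, 64)) > 0.7
raters = [np.logical_xor(reference, rng.random((64, 64)) > 0.95) for _ in range(3)]
print(f"Estimated agreement ceiling (PGT proxy): {inter_rater_agreement(raters):.3f}")
```

Under this reading, a model whose Dice against a single reference annotation exceeds the inter-rater agreement has likely passed the point where further similarity gains reflect annotation idiosyncrasies rather than better real-world performance.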
| Document type: | Conference contribution (Paper) |
|---|---|
| Faculty: | Medicine > Munich Cluster for Systems Neurology (SyNergy); Medicine > Institut für Schlaganfall- und Demenzforschung (ISD) |
| Subject areas: | 000 Computer science, information, general works > 004 Computer science; 600 Technology, medicine, applied sciences > 610 Medicine and health |
| ISBN: | 978-1-6654-7358-3; 978-1-6654-7359-0 |
| Place of publication: | Piscataway |
| Language: | English |
| Document ID: | 124318 |
| Date deposited in Open Access LMU: | 26 Feb 2025 06:56 |
| Last modified: | 26 Feb 2025 06:56 |
| DFG: | Funded by the Deutsche Forschungsgemeinschaft (DFG) - 390857198 |