
Gossmann, Alexej; Cha, Kenny H. and Sun, Xudong (2020): Performance deterioration of deep neural networks for lesion classification in mammography due to distribution shift: an analysis based on artificially created distribution shift. In: Medical Imaging 2020: Computer-Aided Diagnosis, Vol. 11314, 1131404

Full text is not available on 'Open Access LMU'.

Abstract

Despite the prominent success of deep learning (DL) in medical imaging for tasks such as computer-aided detection and diagnosis, the field faces a number of challenging problems. An important issue is the mismatch of data distributions between different data sources, known as distribution shift. Distribution shifts may also be present between different subpopulations or subgroups. Distribution shifts that are not easily detectable can prevent the successful deployment of DL models in medical imaging. We use variational inference to create subsets of a given dataset while enforcing artificial distribution shifts between these subsets, thus creating subsets with different characteristics that represent different pseudo "data sources". By training and testing ROI-based malignant/benign lesion classification models across these pseudo data sources, we evaluate the extent to which distribution shift can deteriorate the performance of popular DL models. We show that distribution shift indeed poses a serious concern for malignant/benign lesion classification in mammography, and that the algorithmically created pseudo data sources may not correspond to any recorded clinical or image characteristics. This study demonstrates a potential method for evaluating deep learning algorithms for robustness against distribution shifts. Furthermore, our technique can serve as a benchmark method for the development of new models which aim to be robust to distribution shift.
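The core evaluation idea in the abstract, training on one pseudo data source and testing on a shifted one, can be illustrated with a minimal toy sketch. This is not the paper's method (which uses variational inference on mammography data and deep networks); here the "distribution shift" is simply a mean shift in a synthetic 1-D feature, and the "model" is a threshold classifier, all names and values being illustrative assumptions:

```python
import random

random.seed(0)

def make_source(n, benign_mu, malig_mu, sigma=1.0):
    # Each sample: (feature, label); label 1 = "malignant".
    data = [(random.gauss(benign_mu, sigma), 0) for _ in range(n)]
    data += [(random.gauss(malig_mu, sigma), 1) for _ in range(n)]
    return data

# Two pseudo "data sources": source B's feature distribution is
# shifted by +1 relative to source A (arbitrary illustrative values).
source_a = make_source(500, benign_mu=0.0, malig_mu=2.0)
source_b = make_source(500, benign_mu=1.0, malig_mu=3.0)

def fit_threshold(data):
    # Pick the decision threshold that maximizes training accuracy.
    best_t, best_acc = 0.0, -1.0
    for t, _ in data:
        acc = sum((x > t) == bool(y) for x, y in data) / len(data)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def accuracy(data, t):
    return sum((x > t) == bool(y) for x, y in data) / len(data)

t = fit_threshold(source_a)      # "train" on source A only
acc_in = accuracy(source_a, t)   # in-distribution performance
acc_out = accuracy(source_b, t)  # performance under distribution shift
print(f"in-distribution: {acc_in:.3f}, shifted: {acc_out:.3f}")
```

The in-distribution accuracy exceeds the shifted accuracy, mirroring the performance deterioration the paper measures with deep classifiers on artificially separated mammography subsets.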
