Abstract
Purpose: The aim of this study was to develop and evaluate a software toolkit that allows for fully automated body composition analysis in contrast-enhanced abdominal computed tomography, leveraging the strengths of both quantitative information from dual-energy computed tomography and simple detection and segmentation tasks performed by deep convolutional neural networks (DCNN).

Methods and materials: Both public and private datasets were used to train and validate the DCNN. A combination of two DCNN and quantitative thresholding was used to classify axial CT slices belonging to the abdominal region, to classify voxels as fat or muscle, and to differentiate between subcutaneous and visceral fat. For validation, patients undergoing repetitive examinations (± 21 days) and patients who underwent concurrent bioelectrical impedance analysis (BIA) were analyzed. The concordance correlation coefficient (CCC), linear regression, and Bland-Altman analysis were used as statistical tests.

Results: Results provided by the algorithm toolkit were visually validated. The automated classifier extracted the slices of interest from full-body scans with an accuracy of 98.7%. DCNN-based segmentation of subcutaneous fat reached a mean Dice similarity coefficient of 0.95. CCCs were 0.99 for both muscle and subcutaneous fat and 0.98 for visceral fat in patients undergoing repetitive examinations (n = 48). Linear regression and Bland-Altman analyses further suggested good agreement (r² = 0.67-0.88) between the software toolkit and BIA in patients who underwent concurrent BIA (n = 39).

Conclusion: We describe a software toolkit allowing for an accurate analysis of body composition utilizing a combination of DCNN- and threshold-based segmentations from spectral detector computed tomography.
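To make the hybrid pipeline concrete, the following is a minimal sketch of the threshold-based voxel classification step described in the abstract. The abstract does not publish the toolkit's actual Hounsfield unit (HU) thresholds, so the ranges below (`FAT_HU`, `MUSCLE_HU`) are commonly cited literature values, not the authors' parameters; the function names and the idea of splitting fat using a DCNN-predicted subcutaneous mask are likewise illustrative assumptions.

```python
import numpy as np

# Assumed HU windows (literature values, not the paper's exact thresholds).
FAT_HU = (-190, -30)     # adipose tissue
MUSCLE_HU = (-29, 150)   # skeletal muscle

def classify_voxels(ct_slice_hu: np.ndarray) -> dict:
    """Label voxels of an axial CT slice (in HU) as fat or muscle by thresholding."""
    fat = (ct_slice_hu >= FAT_HU[0]) & (ct_slice_hu <= FAT_HU[1])
    muscle = (ct_slice_hu >= MUSCLE_HU[0]) & (ct_slice_hu <= MUSCLE_HU[1])
    return {"fat": fat, "muscle": muscle}

def split_fat(fat_mask: np.ndarray, subcutaneous_mask: np.ndarray):
    """Split fat voxels using a (DCNN-predicted) subcutaneous compartment mask:
    visceral fat is simply the fat that falls outside that compartment."""
    subcutaneous = fat_mask & subcutaneous_mask
    visceral = fat_mask & ~subcutaneous_mask
    return subcutaneous, visceral
```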
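The validation metrics the abstract reports (Dice similarity coefficient, Lin's CCC, Bland-Altman limits of agreement) have standard definitions; the sketch below implements those textbook formulas and is not taken from the authors' code.

```python
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|)."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

def lin_ccc(x: np.ndarray, y: np.ndarray) -> float:
    """Lin's concordance correlation coefficient between two measurement series."""
    mx, my = x.mean(), y.mean()
    cov = ((x - mx) * (y - my)).mean()          # population covariance
    return 2.0 * cov / (x.var() + y.var() + (mx - my) ** 2)

def bland_altman(x: np.ndarray, y: np.ndarray):
    """Bias (mean difference) and 95% limits of agreement."""
    d = x - y
    bias, sd = d.mean(), d.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```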
Document type: | Journal article |
---|---|
Faculty: | Medicine |
Subject areas: | 600 Technology, Medicine, Applied Sciences > 610 Medicine and Health |
ISSN: | 0720-048X |
Language: | English |
Document ID: | 85649 |
Date of publication on Open Access LMU: | 25 Jan 2022, 09:15 |
Last changes: | 25 Jan 2022, 09:15 |