Abstract
Motor behaviour analysis is essential to biomedical research and clinical diagnostics, as it provides a non-invasive strategy for identifying motor impairment and tracking how it changes under interventions. State-of-the-art instrumented movement analysis is time- and cost-intensive because it requires the placement of physical or virtual markers. Besides the effort of annotating the keypoints needed to train or fine-tune a detector, users must know the behaviour of interest beforehand to provide meaningful keypoints. Here, we introduce unsupervised behaviour analysis and magnification (uBAM), an automatic deep learning algorithm for analysing behaviour by discovering and magnifying deviations. A central aspect is unsupervised learning of posture and behaviour representations to enable an objective comparison of movement. Besides discovering and quantifying deviations in behaviour, we also propose a generative model for visually magnifying subtle behaviour differences directly in a video, without a detour via keypoints or annotations. Essential for this magnification of deviations, even across different individuals, is the disentangling of appearance and behaviour. Evaluations on rodents and human patients with neurological diseases demonstrate the wide applicability of our approach. Moreover, combining optogenetic stimulation with our unsupervised behaviour analysis shows its suitability as a non-invasive diagnostic tool that correlates function with brain plasticity.
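To make the magnification idea concrete, the sketch below illustrates one plausible reading of it: deviations are measured in a disentangled posture space, linearly extrapolated away from a healthy reference, and rendered back with the subject's own appearance. The names `enc_appearance`, `enc_posture`, `decoder` and the `strength` parameter are hypothetical stand-ins, not the paper's actual interfaces; this is a minimal sketch under those assumptions, not the authors' implementation.

```python
import torch

@torch.no_grad()
def magnify(frame_subject: torch.Tensor,
            frame_reference: torch.Tensor,
            enc_appearance, enc_posture, decoder,
            strength: float = 2.0) -> torch.Tensor:
    """Amplify the posture deviation of a subject from a healthy reference.

    All three modules are assumed to be pretrained callables:
      enc_appearance(frame) -> appearance code (who is moving)
      enc_posture(frame)    -> posture code (how they move)
      decoder(app, pose)    -> reconstructed frame
    """
    z_app = enc_appearance(frame_subject)      # subject identity/appearance
    z_pose_subj = enc_posture(frame_subject)   # subject behaviour
    z_pose_ref = enc_posture(frame_reference)  # reference (healthy) behaviour

    # Extrapolating past the subject's posture (strength > 1) magnifies
    # its deviation from the reference in the disentangled posture space.
    z_pose_mag = z_pose_ref + strength * (z_pose_subj - z_pose_ref)

    # Decoding with the subject's own appearance keeps the magnified
    # behaviour in that individual's video, even across individuals.
    return decoder(z_app, z_pose_mag)
```

Because appearance and behaviour are encoded separately, the same `z_pose_mag` could in principle be decoded with a different individual's appearance code, which is what makes comparison and magnification across individuals possible.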
Document type: | Journal article |
---|---|
Faculty: | History and the Arts > Department of Arts > Art History |
Subject areas: | 000 Computer science, information and general works > 004 Data processing, computer science; 600 Technology, medicine, applied sciences > 610 Medicine and health |
ISSN: | 2522-5839 |
Language: | English |
Document ID: | 107325 |
Date of publication on Open Access LMU: | 04 Oct 2023, 13:58 |
Last modified: | 28 May 2024, 13:24 |