Abstract
Deficits in spatial memory, orientation, and navigation are often early or neglected signs of degenerative and vestibular neurological disorders. A simple and reliable bedside test of these functions would therefore be highly relevant for diagnostic routine. Pointing at targets in the three-dimensional environment is a basic, well-trained sensorimotor ability that provides a suitable measure. Here we describe a smartphone-based pointing device that uses the built-in inertial sensors to analyze pointing performance in azimuth and polar spatial coordinates. Interpreting the vectors measured in this way is not trivial, since the individuals tested may use at least two different strategies: first, performing the task in an egocentric, eye-based reference system by aligning the fingertip with the target retinotopically, or second, aligning the outstretched arm and index finger with the visual line of sight in allocentric, world-based coordinates, similar to aiming a rifle. The two strategies result in considerable differences in the measured target coordinates. A pilot test in five healthy volunteers, using a refined design of the device and an app for standardized bedside use, revealed an overall mean deviation of less than 5 degrees between the measured and the true coordinates. Future investigations of neurological patients, comparing their performance before and after changes in body position (chair rotation), may allow differentiation of distinct orientation deficits in peripheral (vestibulopathy) or central (hippocampal or cortical) disorders.
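As an illustration of how pointing error in azimuth and polar coordinates can be quantified, the following sketch converts angle pairs to unit direction vectors and computes the angular deviation between a measured and a true pointing direction. This is not the authors' implementation; the axis convention (x forward, y left, z up) and the function names are assumptions for illustration only.

```python
import math

def pointing_vector(azimuth_deg, elevation_deg):
    """Unit direction vector for a pointing direction given azimuth
    (horizontal angle) and elevation (angle above the horizontal plane)
    in degrees. Axis convention (an assumption): x forward, y left, z up."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    return (math.cos(el) * math.cos(az),
            math.cos(el) * math.sin(az),
            math.sin(el))

def angular_deviation_deg(measured, true):
    """Angle in degrees between two unit pointing vectors."""
    dot = sum(m * t for m, t in zip(measured, true))
    dot = max(-1.0, min(1.0, dot))  # clamp against floating-point rounding
    return math.degrees(math.acos(dot))

# Example: a pure 3-degree azimuth error at zero elevation
measured = pointing_vector(3.0, 0.0)
true = pointing_vector(0.0, 0.0)
deviation = angular_deviation_deg(measured, true)  # 3.0 degrees
```

Working with unit vectors and the arc-cosine of their dot product gives a single overall angular error, which is convenient for summarizing performance (e.g., the reported mean deviation below 5 degrees) regardless of whether the error lies in the azimuth or polar component.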
Document type: | Journal article |
---|---|
Faculty: | Medicine
Medicine > Munich Cluster for Systems Neurology (SyNergy) |
Subject areas: | 600 Technology, medicine, applied sciences > 610 Medicine and health |
URN: | urn:nbn:de:bvb:19-epub-106368-8 |
ISSN: | 0340-5354 |
Language: | English |
Document ID: | 106368 |
Date published on Open Access LMU: | 11 Sep 2023, 13:37 |
Last modified: | 06 Jun 2024, 16:19 |
DFG: | Funded by the Deutsche Forschungsgemeinschaft (DFG) - 491502892 |
DFG: | Funded by the Deutsche Forschungsgemeinschaft (DFG) - 390857198 |