
Pfeuffer, Ken (ORCID: https://orcid.org/0000-0002-5870-1120); Obernolte, Jan (ORCID: https://orcid.org/0000-0003-1191-7044); Dietz, Felix (ORCID: https://orcid.org/0000-0002-0241-0295); Mäkelä, Ville (ORCID: https://orcid.org/0000-0001-6095-2570); Sidenmark, Ludwig (ORCID: https://orcid.org/0000-0002-7965-0107); Manakhov, Pavel (ORCID: https://orcid.org/0000-0003-3443-4088); Pakanen, Minna (ORCID: https://orcid.org/0000-0002-0933-9479) and Alt, Florian (ORCID: https://orcid.org/0000-0001-8354-2195) (2023): PalmGazer: Unimanual Eye-hand Menus in Augmented Reality. 11th ACM Symposium on Spatial User Interaction (SUI), Sydney, Australia, 13-15 October 2023. In: Huang, Tony (Ed.): Proceedings of the 2023 ACM Symposium on Spatial User Interaction. New York: Association for Computing Machinery, pp. 1-10.

Full text not available on 'Open Access LMU'.

Abstract

How can we design user interfaces for augmented reality (AR) so that we can interact as simply, flexibly and expressively as we do with a smartphone in one hand? To explore this question, we propose PalmGazer, an interaction concept that integrates eye-hand interaction to establish a single-handedly operable menu system. In particular, PalmGazer is designed to support quick and spontaneous digital commands – such as playing a music track, checking notifications or browsing visual media – through our devised three-way interaction model: opening the hand to summon the menu UI, eye-hand input for selecting items, and a dragging gesture for navigation. A key aspect is that the menu remains always accessible and movable to the user, as it supports meaningful hand- and head-based reference frames. We demonstrate the concept in practice through a prototypical mobile UI with application probes, and describe technique designs specifically tailored to the application UI. A qualitative evaluation highlights the system's interaction benefits and drawbacks, e.g., that common 2D scroll and selection tasks are simple to operate, but higher degrees of freedom may be reserved for two hands. Our work contributes interaction techniques and design insights to expand AR's unimanual capabilities.
