
Hu Fleischhauer, Yanfei (ORCID: https://orcid.org/0009-0002-9809-8159); Surale, Hemant Bhaskar (ORCID: https://orcid.org/0000-0001-6616-2895); Alt, Florian (ORCID: https://orcid.org/0000-0001-8354-2195) and Pfeuffer, Ken (ORCID: https://orcid.org/0000-0002-5870-1120) (2023): Gaze-based Mode-Switching to Enhance Interaction with Menus on Tablets. 15th Annual ACM Symposium on Eye Tracking Research and Applications (ETRA), Tübingen, Germany, 30 May - 2 June 2023. In: Kasneci, Enkelejda; Shic, Frederick and Khamis, Mohamed (eds.): 2023 Symposium on Eye Tracking Research and Applications, Article 7. New York: Association for Computing Machinery, pp. 1-8.

Full text not available on 'Open Access LMU'.

Abstract

In design work, a common task is interacting with menus to change the drawing mode. Done frequently, this becomes tedious and fatiguing, especially on tablets, where users physically employ a stylus or finger touch. As our eyes are naturally involved in the visual search and acquisition of desired menu items, we propose gaze to shortcut the physical movement. We investigate gaze-based mode-switching for menus on tablets using a novel mode-switching methodology, assessing a gaze-only (dwell-time) technique and a multimodal (gaze and tap) technique against hand-based interaction. The results suggest that users can efficiently alternate between manual and eye input when interacting with the menu; both gaze-based techniques have lower physical demand and individual speed-error trade-offs. This led to a novel technique that substantially reduces time by unifying mode selection and mode application. Our work points to new roles for our eyes in efficiently shortcutting menu actions during the workflow.
