Dari, Simone; Kadrileev, Nikolay; Hüllermeier, Eyke ORCID: 0000-0002-9944-4108 (July 2020): A Neural Network-Based Driver Gaze Classification System with Vehicle Signals. 2020 International Joint Conference on Neural Networks (IJCNN), 19-24 July 2020, Glasgow, UK.
Full text not available from 'Open Access LMU'.


Driver monitoring can play an essential part in avoiding accidents by warning the driver and shifting the driver's attention to the traffic scene in time during critical situations. This applies to the different levels of automated driving, to take-over requests as well as to driving in manual mode. The driver's gaze direction has long served as a reliable proxy for attention. The aim of this work is to introduce a robust gaze detection system. In this regard, we make several contributions that are novel in the area of gaze detection systems. In particular, we propose a deep learning approach to predict gaze regions, which is based on informative features such as eye landmarks and head pose angles of the driver. Moreover, we introduce different post-processing techniques that improve the accuracy by exploiting temporal information from videos and the availability of other vehicle signals. Last but not least, we validate our method with a leave-one-driver-out cross-validation. Unlike previous studies, we do not use gazes to predict maneuver changes; instead, we consider the human-computer-interaction aspect and use vehicle signals to improve the performance of the estimation. The proposed system achieves an accuracy of 92.3%, outperforming earlier landmark-based gaze estimators.
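To make two of the abstract's ideas concrete, the sketch below illustrates (a) temporal smoothing of frame-wise gaze-region predictions via a sliding-window majority vote, and (b) leave-one-driver-out splitting for cross-validation. This is a minimal illustration of the general techniques named in the abstract, not the authors' implementation; all function names, the window size, and the sample format are assumptions.

```python
from collections import Counter


def smooth_predictions(preds, window=5):
    """Majority-vote smoothing over a sliding window of per-frame
    gaze-region predictions (one generic way to exploit temporal
    information from video; window size is an assumption)."""
    half = window // 2
    smoothed = []
    for i in range(len(preds)):
        lo = max(0, i - half)
        hi = min(len(preds), i + half + 1)
        # Most common label within the window wins.
        smoothed.append(Counter(preds[lo:hi]).most_common(1)[0][0])
    return smoothed


def leave_one_driver_out(samples):
    """Yield (held_out_driver, train, test) splits where each test set
    contains exactly one driver's samples. `samples` is a list of
    (driver_id, features) tuples; the format is illustrative only."""
    drivers = sorted({driver for driver, _ in samples})
    for held_out in drivers:
        train = [s for s in samples if s[0] != held_out]
        test = [s for s in samples if s[0] == held_out]
        yield held_out, train, test


# Example: a single spurious "mirror" frame is corrected by its neighbors.
frames = ["road", "road", "mirror", "road", "road"]
print(smooth_predictions(frames))  # -> ['road', 'road', 'road', 'road', 'road']

# Example: three drivers give three train/test splits.
data = [("d1", 0.1), ("d1", 0.2), ("d2", 0.3), ("d3", 0.4)]
for driver, train, test in leave_one_driver_out(data):
    print(driver, len(train), len(test))
```

Leave-one-driver-out splitting ensures the model is always evaluated on an unseen driver, which is what makes the reported accuracy a measure of generalization across drivers rather than memorization of individual gaze patterns.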