Abstract

Monitoring driver behavior is crucial in the design of advanced driver assistance systems (ADAS) that detect driver actions and provide warnings when the driver is not attentive to the driving task. The visual attention of a driver is an important aspect to consider, as most driving tasks require visual resources. Previous work has investigated algorithms to detect driver visual attention by tracking head or eye movements. While tracking the pupils can give an accurate direction of visual attention, estimating gaze in a vehicle environment is a challenging problem due to changes in illumination, head rotations, and occlusions (e.g., hands, glasses). Instead, this paper investigates the use of the head pose as a coarse estimate of the driver's visual attention. The key challenge is the non-trivial relation between head and eye movements while glancing at a target object, which depends on the driver, the underlying cognitive and visual demand, and the environment. First, we evaluate the performance of a state-of-the-art head pose detection algorithm on natural driving recordings, comparing its estimates with ground truth derived from AprilTags attached to a headband. Then, we propose regression models, built with data from natural driving recordings, to estimate the driver's gaze from head position and orientation. The proposed system achieves high accuracy in the horizontal direction, but moderate to low performance in the vertical direction. We compare results obtained while participants were driving with results obtained while the vehicle was parked.
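
The core technical step described above is a regression from head position and orientation to gaze direction. The sketch below illustrates that idea in Python; the feature layout (3D head position plus yaw/pitch/roll), the choice of ordinary linear regression, and the synthetic data are all assumptions made for illustration, not the paper's actual models or recordings.

```python
# Minimal sketch of a head-pose-to-gaze regression, as in the abstract.
# Feature names, data shapes, and the linear model are illustrative
# assumptions; the paper's actual regression models may differ.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Hypothetical training data: each row is one glance event.
# Columns: head position (x, y, z) and head rotation (yaw, pitch, roll).
n = 1000
head_pose = rng.normal(size=(n, 6))

# Hypothetical targets: horizontal and vertical gaze angles (degrees).
# Here gaze is simulated as a noisy function of head pose; in the study
# it would come from annotated glances in natural driving recordings.
true_weights = rng.normal(size=(6, 2))
gaze = head_pose @ true_weights + rng.normal(scale=2.0, size=(n, 2))

X_train, X_test, y_train, y_test = train_test_split(
    head_pose, gaze, test_size=0.2, random_state=0)

# LinearRegression handles the two-dimensional (horizontal, vertical)
# gaze target directly as a multi-output regression.
model = LinearRegression().fit(X_train, y_train)
pred = model.predict(X_test)

# Evaluate each direction separately, mirroring the abstract's finding
# that horizontal accuracy tends to exceed vertical accuracy.
mae_h, mae_v = mean_absolute_error(y_test, pred,
                                   multioutput="raw_values")
print(f"MAE horizontal: {mae_h:.2f} deg, vertical: {mae_v:.2f} deg")
```

Reporting the two gaze dimensions separately matters here because, per the abstract, head pose constrains horizontal gaze much more tightly than vertical gaze, so a single pooled error metric would mask the asymmetry.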
