Abstract

We tested whether head movements under automated driving can be used to classify a vehicle occupant as either situation-aware or unaware. When cornering manually, an active driver's head tilt correlates with the road angle, which serves as a visual reference, whereas an inactive passenger's head follows the g-forces. Transferred to partial/conditional automation, the question arises whether situation-aware occupants' head movements are comparable to those of drivers and whether this can be used for classification. In a driving-simulator study (n = 43, within-subject design), four scenarios were used to build up or degrade situation awareness (verified by manipulation checks). Recurrent neural networks were trained on the resulting head movements. Inferential statistics were used to extract the discriminating feature, ensuring explainability. Highly accurate classification was achieved, and the mean side rotation rate was identified as the most differentiating feature. Aware occupants behave more like drivers. Head movements can therefore be used to classify situation awareness not only in experimental settings but also in real driving.
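The abstract reports training recurrent neural networks on head-movement time series to separate aware from unaware occupants. The paper's own implementation is not reproduced here; the following is a minimal sketch of such a sequence classifier, assuming PyTorch, a GRU architecture, and a three-channel rotation-rate input (roll, pitch, yaw). All module names, hyperparameters, and the feature layout are illustrative assumptions rather than the authors' method.

```python
import torch
import torch.nn as nn

class HeadMovementClassifier(nn.Module):
    """GRU-based binary classifier for head-movement time series.

    Expects input of shape (batch, time, channels); the three channels
    assumed here are head rotation rates about the roll, pitch, and yaw
    axes. This feature layout is an assumption for illustration only.
    """

    def __init__(self, n_channels: int = 3, hidden_size: int = 64,
                 num_layers: int = 2):
        super().__init__()
        self.gru = nn.GRU(n_channels, hidden_size, num_layers,
                          batch_first=True, dropout=0.2)
        self.classifier = nn.Linear(hidden_size, 1)  # one logit: aware vs. unaware

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # h_n holds the final hidden state of each layer: (num_layers, batch, hidden)
        _, h_n = self.gru(x)
        # Classify from the last layer's final hidden state.
        return self.classifier(h_n[-1])

# Toy usage: 8 synthetic sequences, 500 time steps, 3 rotation-rate channels.
model = HeadMovementClassifier()
x = torch.randn(8, 500, 3)
labels = torch.randint(0, 2, (8, 1)).float()  # 1 = situation-aware, 0 = unaware
loss = nn.BCEWithLogitsLoss()(model(x), labels)
loss.backward()
```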
