Abstract

This paper introduces a reference model of glance behavior for driving safety assessment, which can inform the design of automated and assistive systems. Technological limitations have previously hindered the use of unobtrusive eye trackers to measure glance behavior in naturalistic conditions. This paper presents a comprehensive analysis of eye-tracking data collected in a naturalistic field operational test, using an eye tracker that proved robust in real-world driving. We describe a post-processing technique that enhances the quality of naturalistic eye-tracker data, propose a data-analysis procedure that captures the important features of glance behavior, and develop a model of glance behavior based on distribution fitting, which has been lacking in the literature. The model and its metrics capture the key defining characteristics of, and differences between, on- and off-path glance distributions, both during manual driving and while driving with adaptive cruise control and lane keeping aid active. The results show that drivers' visual response is tightly coupled to the driving context (vehicle automation, car-following, and illumination).
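
To illustrate the kind of distribution fitting the abstract refers to, the following is a minimal sketch only: the choice of a lognormal distribution, the placeholder duration values, and the variable names are assumptions for illustration, not details taken from the paper.

# Hypothetical sketch: fit a lognormal distribution to off-path glance durations
# (the data values and choice of distribution are illustrative assumptions).
import numpy as np
from scipy import stats

# Placeholder off-path glance durations in seconds.
glance_durations = np.array([0.4, 0.6, 0.8, 1.1, 0.5, 1.6, 0.9, 0.7, 2.1, 1.3])

# Fit a lognormal distribution; fixing loc=0 restricts the fit to positive durations.
shape, loc, scale = stats.lognorm.fit(glance_durations, floc=0)

# Summary metrics of the fitted distribution.
median = scale                        # exp(mu) when loc = 0
mean = scale * np.exp(shape**2 / 2)   # exp(mu + sigma^2 / 2)
print(f"fitted sigma={shape:.3f}, median={median:.2f} s, mean={mean:.2f} s")

Metrics such as the fitted median and mean could then be compared across conditions (e.g., manual driving versus driving with assistance systems active), in the spirit of the comparisons described above.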
