Abstract
This paper presents a practical method for analyzing drivers’ eye movements, providing a valuable tool for understanding driver behavior during driving simulations. The method, based on an image-processing technique, addresses the challenge of interpreting fixations when no depth information is available and the screen image changes or moves with the simulation. It identifies the gaze position relative to the road, determining whether a glance falls inside or outside it. This is achieved by transforming the RGB frames collected by the eye-tracker video camera into black-and-white images with the Canny filter, which detects object contours by evaluating the change in color across their surfaces. A window is then applied to these edge images to extract information about the gaze position in the real world. The method was tested on a sample of four drivers. The findings reveal variation among drivers and a disparity between driving in curved and rectilinear segments: in curved sections the gaze typically falls inside the road, whereas in rectilinear sections it frequently falls outside.
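The pipeline sketched in the abstract (edge-detect each frame, then inspect a window around the gaze point) can be illustrated with a minimal NumPy example. Note the assumptions: a simple gradient-magnitude threshold stands in for the full Canny filter, the synthetic frame, the `gaze_window_edges` helper, and the window half-size are all hypothetical choices for illustration, not the authors' implementation.

```python
import numpy as np

def edge_map(gray, thresh=0.25):
    # Simplified stand-in for the Canny step: mark pixels whose
    # gradient magnitude exceeds a fraction of the frame maximum.
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    return mag > thresh * mag.max()

def gaze_window_edges(gray, gaze_xy, half=10):
    # Crop a window centered on the gaze point (x, y) and count
    # edge pixels inside it; many edges suggest the gaze sits near
    # a road contour, none suggests a uniform off-road region.
    x, y = gaze_xy
    edges = edge_map(gray)
    win = edges[max(y - half, 0):y + half, max(x - half, 0):x + half]
    return int(win.sum())

# Synthetic frame: a dark horizontal "road" band on a bright background,
# whose borders produce strong contours.
frame = np.full((120, 160), 200, dtype=np.uint8)
frame[40:80, :] = 50

print(gaze_window_edges(frame, (80, 40)))  # window straddles a road contour
print(gaze_window_edges(frame, (80, 10)))  # window on uniform background
```

In practice one would replace `edge_map` with `cv2.Canny` on each video frame and map the counted contours back to road boundaries before classifying the glance as inside or outside.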