Abstract

This paper proposes a method for obtaining drivers' fixation points and establishing a preview model based on real-vehicle tests. First, eight drivers were recruited to carry out vehicle tests on actual straight and curved roads. The curvature radii of the test curves were 200, 800, and 1500 m, and subjects were required to drive at speeds of 50, 70, and 90 km/h. During driving, drivers' eye movement data were collected with a head-mounted eye tracker, while front road scene images and vehicle states were recorded simultaneously. An image-to-world coordinate mapping model of the drivers' visual information was constructed by correcting image distortion and matching the eye-tracker images with those from a fixed driving recorder. Fixation point data were then obtained using the Identification by Dispersion Threshold (I-DT) algorithm. In addition, the Jarque–Bera test was used to verify the normal distribution characteristics of these data and to fit the parameters of the normal distribution function. The preview points were then extracted and projected into the world coordinate system. Finally, the preview data obtained under these conditions were fitted to build general preview time probability density maps for different driving speeds and road curvatures. This study extracts the preview characteristics of drivers through actual vehicle tests, providing a visual-behavior reference for human-like control of intelligent vehicles.
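
The I-DT algorithm named above groups raw gaze samples into fixations using a dispersion threshold and a minimum duration. The following Python sketch illustrates the general idea; the sample format, threshold values, and parameter names are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

def idt_fixations(t, x, y, dispersion_thresh=30.0, min_duration=0.1):
    """Identification by Dispersion Threshold (I-DT) sketch.

    t, x, y            -- 1-D NumPy arrays: sample time [s], gaze position [px]
    dispersion_thresh  -- max allowed (x-range + y-range) within a fixation [px]
    min_duration       -- minimum fixation duration [s]

    Returns a list of (start_time, end_time, centroid_x, centroid_y).
    """
    fixations = []
    i, n = 0, len(t)
    while i < n:
        # initial window covering at least the minimum duration
        j = i
        while j < n and t[j] - t[i] < min_duration:
            j += 1
        if j >= n:
            break
        dispersion = (x[i:j + 1].max() - x[i:j + 1].min()
                      + y[i:j + 1].max() - y[i:j + 1].min())
        if dispersion <= dispersion_thresh:
            # grow the window while dispersion stays below the threshold
            while j + 1 < n:
                grown = (x[i:j + 2].max() - x[i:j + 2].min()
                         + y[i:j + 2].max() - y[i:j + 2].min())
                if grown > dispersion_thresh:
                    break
                j += 1
            fixations.append((t[i], t[j],
                              x[i:j + 1].mean(), y[i:j + 1].mean()))
            i = j + 1
        else:
            i += 1  # no fixation here; slide the window forward one sample
    return fixations
```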

Highlights

  • The driver is the main component in a driver-vehicle-road closed-loop control system

  • From the eye movement data and image information collected with the eye tracker, more accurate fixation points were obtained through a series of image processing steps, including distortion correction of the eye-tracker images and matching with images from a fixed-position driving recorder, which reduces the impact of drivers' head movements

  • Parameter values w and y0 of each normal distribution function can be fitted from the statistical distribution chart, where w is the standard deviation of the longitudinal distribution of fixation points and y0 is the ordinate of the distribution center of fixation points
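
The normality check and parameter fitting described in this highlight could, for example, be carried out as in the sketch below. The input file and variable names are hypothetical; only the general procedure (Jarque–Bera test followed by a normal fit yielding y0 and w) follows the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical sample: longitudinal (y) image coordinates of fixation points [px]
fixation_y = np.loadtxt("fixation_points_y.txt")  # assumed file layout

# Jarque-Bera test: a large p-value gives no evidence against normality
jb_stat, p_value = stats.jarque_bera(fixation_y)
print(f"Jarque-Bera statistic = {jb_stat:.2f}, p = {p_value:.3f}")

# Fit the normal density: y0 is the distribution center, w its standard deviation
y0, w = stats.norm.fit(fixation_y)
print(f"y0 = {y0:.1f} px, w = {w:.1f} px")
```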


Summary

Introduction

The driver is the main component in a driver-vehicle-road closed-loop control system. Previous studies of driver visual control have typically used experimental simulators to collect drivers' eye movement behavior data, whereas this study collected such data in actual vehicle tests. From the eye movement data and image information collected with the eye tracker, more accurate fixation points were obtained through a series of image processing steps, including distortion correction of the eye-tracker images and matching them with images from a fixed-position driving recorder, which reduces the impact of drivers' head movements.
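
As an illustration of such a pipeline, the sketch below undistorts an eye-tracker scene image with OpenCV and maps a fixation point into the fixed driving-recorder image through a feature-based homography. The calibration files, function name, and matching strategy (ORB features with a RANSAC homography) are assumptions for demonstration, not the paper's exact implementation.

```python
import cv2
import numpy as np

# Intrinsic matrix and distortion coefficients of the eye tracker's scene camera,
# assumed here to come from a prior checkerboard calibration.
K = np.load("scene_camera_K.npy")
dist = np.load("scene_camera_dist.npy")

def map_fixation_to_recorder(scene_img, recorder_img, fixation_px):
    """Undistort the eye-tracker scene image, then map one fixation point
    (given in the undistorted scene image, in pixels) into the fixed
    driving-recorder image via a feature-based homography."""
    undistorted = cv2.undistort(scene_img, K, dist)

    # ORB features + brute-force matching between the two views
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(undistorted, None)
    kp2, des2 = orb.detectAndCompute(recorder_img, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    matches = sorted(matches, key=lambda m: m.distance)[:200]

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Project the fixation point into the recorder image
    pt = np.float32([[fixation_px]]).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pt, H).reshape(2)
```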

Related Works
Apparatus
Subjects
Conditions
Process
Fixation Point Acquisition
Fixation Separation
Invalid Data Elimination
Fixation Point Distribution
Parameter Fitting
Relationship between Fixation Point and Preview Point
Coordinate Transformation of Preview Point
Preview Time Model
Model Validation
Findings
Conclusions