Abstract

This paper presents a data fusion method for the on-board, real-time detection of driver drowsiness. Multiple sensors were implemented, including a camera to capture the driver's eye state, a steering-angle sensor to measure the driver's steering behavior, and a clock to indicate time on task. A data fusion framework based on Dempster-Shafer theory is built to model and combine the pieces of evidence and to generate an overall inference of the driver's drowsiness level. The method was validated in an experiment on a driving simulator. The results suggest that the data fusion process can reduce the uncertainty in the drowsiness inference and achieve better system performance than any single sensor alone.
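
For illustration, the evidence-combination step described above can be sketched with Dempster's rule of combination. The Python sketch below is not the paper's implementation: the two-hypothesis frame of discernment {drowsy, alert}, the mass values, and the way each sensor is mapped to a mass function are illustrative assumptions, since the abstract does not specify them.

    from itertools import product

    def combine(m1, m2):
        # Dempster's rule of combination for two mass functions.
        # A mass function maps frozensets of hypotheses to belief mass summing to 1.
        fused, conflict = {}, 0.0
        for (b, mb), (c, mc) in product(m1.items(), m2.items()):
            inter = b & c
            if inter:
                fused[inter] = fused.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc  # mass falling on the empty set
        if conflict >= 1.0:
            raise ValueError("totally conflicting evidence; combination undefined")
        return {a: v / (1.0 - conflict) for a, v in fused.items()}  # normalize

    # Hypothetical mass assignments from the three evidence sources
    THETA = frozenset({"drowsy", "alert"})  # frame of discernment (assumed)
    eye = {frozenset({"drowsy"}): 0.6, THETA: 0.4}            # eye-state camera
    steering = {frozenset({"drowsy"}): 0.3,
                frozenset({"alert"}): 0.2, THETA: 0.5}        # steering-angle sensor
    time_on_task = {frozenset({"drowsy"}): 0.2, THETA: 0.8}   # clock / time on task

    overall = combine(combine(eye, steering), time_on_task)
    print(overall[frozenset({"drowsy"})])  # combined belief mass for "drowsy"

In such a scheme, each sensor reading is first converted to a basic probability assignment; Dempster's rule then fuses the assignments pairwise, redistributing the conflicting mass, and the combined belief in "drowsy" can be thresholded to infer the driver's drowsiness level.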
