Abstract
A driver's gaze direction is critical information for understanding driver state. In this paper, the authors present a distributed camera framework for estimating the driver's coarse gaze direction from both head and eye cues. Coarse gaze direction is often sufficient for a number of applications; the challenge, however, is to estimate it robustly in naturalistic real-world driving. Towards this end, the authors propose gaze-surrogate features estimated from the eye region via eyelid and iris analysis, and present a novel computational framework for iris detection. These features can be extracted robustly and used to determine the driver's gaze zone effectively. The proposed system is evaluated on a dataset collected from naturalistic on-road driving on urban streets and freeways, with ground-truth gaze zones annotated by a human expert using information from the driver's eyes and the surrounding context. Two experiments compare the performance of gaze zone estimation with and without eye cues. The head-alone experiment yields reasonably good results for most gaze zones, with an overall weighted accuracy of 79.8%. Adding eye cues boosts the overall weighted accuracy to 94.9% and improves the true detection rate for every individual gaze zone, especially between adjacent zones. These evaluations demonstrate the efficacy of the proposed features and show very promising results for robust gaze zone estimation.
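To make the head-alone versus head-plus-eye comparison concrete, the sketch below illustrates the general idea of training a gaze-zone classifier on head-pose features alone and then on head pose combined with eye cues, and comparing a class-balanced accuracy on held-out data. This is a minimal illustration only, not the authors' implementation: the random-forest classifier, the synthetic data, the specific feature names (yaw/pitch/roll, eyelid openness, iris offsets), and the use of balanced accuracy as a stand-in for the paper's weighted accuracy are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: in the paper the features come from naturalistic
# driving video; random values are used here purely for illustration.
rng = np.random.default_rng(0)
n_samples, n_zones = 1000, 6                      # hypothetical number of gaze zones
head_pose = rng.normal(size=(n_samples, 3))       # assumed head cues: yaw, pitch, roll
eye_cues = rng.normal(size=(n_samples, 3))        # assumed eye cues: eyelid openness, iris x/y offsets
zones = rng.integers(0, n_zones, size=n_samples)  # annotated gaze-zone labels

def zone_accuracy(features, labels):
    """Train a classifier and report class-balanced accuracy on a held-out split."""
    X_tr, X_te, y_tr, y_te = train_test_split(features, labels, random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    return balanced_accuracy_score(y_te, clf.predict(X_te))

# Mirror the paper's two experiments: head-alone vs. head + eye cues.
acc_head = zone_accuracy(head_pose, zones)
acc_both = zone_accuracy(np.hstack([head_pose, eye_cues]), zones)
print(f"head only: {acc_head:.3f}, head + eye cues: {acc_both:.3f}")
```

With the paper's real features, the second configuration is the one reported to raise weighted accuracy from 79.8% to 94.9%; on the synthetic data above the numbers are meaningless and serve only to show the evaluation structure.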