Abstract
Driver state analysis is a promising application of computer vision. Facial images contain information that enables recognition of a driver's state, but this information is imperfect and varies with the external environment. Modeling the relationship between facial information and the driver's state therefore plays an essential role in driver fatigue detection. In this work, facial sequences are aligned and normalized, after which a few fixed observation areas related to fatigue expressions are extracted. Discriminative features representing facial states are then extracted from these areas. Because a single image does not contain enough information to reflect fatigue expressions, a sequence of face images is exploited for fatigue detection using a sliding window; thus, both static and sequential information are used to represent the driver's state. An algorithm is designed to evaluate the quality of the extracted candidate features. Each area contains only partial information and provides a single view of the evidence for state recognition, so we built base models from the information extracted from specific facial areas and integrated them to recognize the driver's state. Experimental results show that these base models offer complementary information for accurately identifying facial status, and that the integrated model performs well in driver state analysis.
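The abstract describes pooling per-frame features over a sliding window and averaging decisions from per-area base models. A minimal sketch of that pipeline shape is below; the window size, feature dimensions, area names, and the mean-fusion rule are illustrative assumptions, not the paper's exact method.

```python
# Hypothetical sketch of the abstract's pipeline: sliding-window pooling of
# per-frame features plus score-level fusion of per-area base models.
# Window size, feature dimension, and the averaging rule are assumptions.
import numpy as np

WINDOW = 5  # number of consecutive frames pooled per decision (assumption)

def sliding_windows(frame_features, window=WINDOW):
    """Yield one stacked feature vector per window of consecutive frames."""
    for start in range(len(frame_features) - window + 1):
        yield np.concatenate(frame_features[start:start + window])

def fuse_base_models(area_scores):
    """Integrate per-area base models by averaging their fatigue scores."""
    return float(np.mean(area_scores))

# Toy usage: 8 frames with 4 features each, and fatigue scores from three
# hypothetical base models (e.g. eye, mouth, and head-pose areas).
rng = np.random.default_rng(0)
frames = [rng.random(4) for _ in range(8)]
windows = list(sliding_windows(frames))   # 4 windows of 20 features each
score = fuse_base_models([0.8, 0.6, 0.7])
print(len(windows), round(score, 2))      # → 4 0.7
```

The fusion step here is a plain mean for clarity; any score- or decision-level combination rule could be substituted without changing the overall structure.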