Abstract

Driver fatigue is a significant factor in many transportation accidents, so detecting it is vital to improving transportation safety. A fatigued driver exhibits characteristic visual cues on his or her face. In this paper, we systematically combine visual cues from the eyes and mouth, namely eye closure and yawning, to infer a driver's fatigue level. AdaBoost is used to select the most discriminative local binary pattern (LBP) features of the eye regions and to construct a highly accurate classifier that yields the eye visual cue. Yawning is important evidence of driver fatigue: we locate the driver's left and right mouth corners by gray projection, extract texture features at the mouth corners using Gabor wavelets, and obtain the mouth visual cue by classifying the Gabor features with linear discriminant analysis (LDA). A probabilistic method based on Bayesian networks (BN) fuses the two visual cues at the confidence level for fatigue detection. The proposed method has been tested on a wide range of human subjects under real-life fatigue conditions spanning different genders, poses, and illumination conditions, and it yields much more robust, reliable, and accurate fatigue detection than any single visual cue alone. The test data comprises 4800 images from thirty people's videos, on which the average recognition rate of the proposed method is 96.79%.
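To make the eye-cue feature extraction concrete, the following is a minimal sketch of computing basic 8-neighbor LBP codes and their normalized histogram over an eye region. This is illustrative only: the abstract does not specify the paper's exact LBP variant (radius, uniform patterns, block partitioning), so the parameters and function names here are assumptions.

```python
import numpy as np

def lbp_8_1(img):
    """Basic 8-neighbor, radius-1 LBP on a 2D grayscale array.

    Each interior pixel gets an 8-bit code: bit i is set when the
    i-th neighbor is >= the center pixel. (Illustrative sketch; the
    paper's exact LBP variant is not specified in the abstract.)
    """
    img = np.asarray(img, dtype=np.int32)
    center = img[1:-1, 1:-1]
    # Offsets of the 8 neighbors, ordered clockwise from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = img[1 + dy : img.shape[0] - 1 + dy,
                       1 + dx : img.shape[1] - 1 + dx]
        codes |= (neighbor >= center).astype(np.int32) << bit
    return codes

def lbp_histogram(eye_region, bins=256):
    """Normalized histogram of LBP codes; a feature-selection step
    such as AdaBoost would then pick the most discriminative bins."""
    codes = lbp_8_1(eye_region)
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / max(hist.sum(), 1)
```

In the pipeline described above, such histograms (or the raw per-pixel codes) would serve as the candidate feature pool from which AdaBoost selects discriminative features to classify open versus closed eyes.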
