Abstract

Working in a fatigued state is not only inefficient but also raises a series of safety concerns and health problems. This article presents a novel fatigue detection algorithm based on facial multifeature fusion that offers promising real-time responsiveness and accuracy. After each video frame is converted to grayscale and histogram-equalized, facial landmarks are located with the Dlib toolkit; the facial features extracted from these landmark points are then evaluated to obtain the eye aspect ratio (EAR), mouth aspect ratio (MAR), and head Euler angles (HEAs) in real time. These indexes are further used to calculate the blinking frequency (BF), the percentage of eyelid closure over the pupil over time (PERCLOS), the yawning frequency (YF), and the nodding frequency (NF). The four parameters are normalized to establish the detection model, which identifies the fatigue grade with high accuracy and quick response. Practical tests verify the algorithm's reliability: the results show that the accuracy of detecting fatigue behaviors exceeds 94.4%, and the final judgment matches the actual physiological state while the real-time performance of the detection is maintained.
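As an illustration of the feature-extraction step described above, the short sketch below computes EAR and MAR from Dlib's standard 68-point facial landmarks. The abstract does not give the authors' exact formulas, so the definitions used here (the common EAR formulation and an inner-mouth MAR) and the landmark indices are assumptions for demonstration only.

```python
# Illustrative sketch, not the authors' exact implementation.
# Assumes Dlib's 68-point landmark model: eyes at points 36-41 / 42-47,
# inner mouth at points 60-67.
import numpy as np

def eye_aspect_ratio(eye):
    """eye: (6, 2) array of one eye's landmark coordinates in Dlib order."""
    a = np.linalg.norm(eye[1] - eye[5])   # vertical eyelid distance
    b = np.linalg.norm(eye[2] - eye[4])   # vertical eyelid distance
    c = np.linalg.norm(eye[0] - eye[3])   # horizontal eye width
    return (a + b) / (2.0 * c)

def mouth_aspect_ratio(mouth):
    """mouth: (8, 2) array of inner-mouth landmarks (Dlib points 60-67)."""
    opening = np.linalg.norm(mouth[2] - mouth[6])  # vertical mouth opening
    width = np.linalg.norm(mouth[0] - mouth[4])    # horizontal mouth width
    return opening / width
```

In a pipeline of this kind, per-frame EAR and MAR values are typically thresholded and counted over a sliding time window to obtain BF, PERCLOS, and YF, while the head Euler angles are similarly monitored to detect nods (NF); the specific thresholds and window lengths would be set by the detection model itself.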
