Abstract
Driver fatigue and drowsiness are widely acknowledged as significant contributors to vehicular accidents. Over the preceding decade, a total of 66,000 motor vehicle accidents resulted in 22,952 fatalities and 79,545 injuries. Measures that alert drivers are therefore essential to prevent accidents and save lives. The purpose of this research was to develop and implement a Driving Drowsiness Detection System (DDDS) to increase road safety. The system uses eye and mouth movements as indicators to detect fatigue and reduce the likelihood of accidents. The objective was to achieve accurate drowsiness detection using a high-resolution camera and a Deep Cascaded Convolutional Neural Network (DCCNN). The methodology involves the analysis of driver behavior, with particular emphasis on eye movements captured by a high-resolution camera. The facial region was identified using landmarks extracted with the Dlib toolkit, and the "Eye Aspect Ratio" metric was developed to quantify the level of fatigue. The DCCNN output triggers an alarm on the dashboard of the graphical user interface (GUI). The study used images of 450×320 pixels at a frame rate of 60 frames per second. Under ideal lighting conditions, drowsiness detection accuracy exceeded 99.9%, and under less-than-ideal lighting conditions it exceeded 99.8%. Compared with previous research, the present study achieved higher accuracy in detecting drowsiness.
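To illustrate the eye-based portion of the pipeline described above, the following is a minimal sketch of an Eye Aspect Ratio (EAR) computation over Dlib facial landmarks. It assumes the standard 68-point landmark model file ("shape_predictor_68_face_landmarks.dat") and an illustrative closed-eye threshold (EAR_THRESHOLD = 0.25); the abstract does not specify the paper's exact threshold, landmark model, or how the EAR values are fed into the DCCNN stage, so these details are assumptions for demonstration only.

```python
# Hedged sketch: EAR-based eye-closure check with Dlib landmarks.
# EAR_THRESHOLD and the landmark model path are assumptions, not the paper's values.
import cv2
import dlib
import numpy as np

EAR_THRESHOLD = 0.25  # assumed cut-off for "eyes mostly closed"

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def eye_aspect_ratio(eye):
    # eye: six (x, y) landmark points in the dlib 68-point ordering.
    a = np.linalg.norm(eye[1] - eye[5])  # first vertical eyelid distance
    b = np.linalg.norm(eye[2] - eye[4])  # second vertical eyelid distance
    c = np.linalg.norm(eye[0] - eye[3])  # horizontal eye-corner distance
    return (a + b) / (2.0 * c)

def frame_is_drowsy(frame):
    # Returns True if any detected face in the frame has a low mean EAR.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray, 0):
        shape = predictor(gray, face)
        pts = np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)])
        left_ear = eye_aspect_ratio(pts[36:42])   # left-eye landmarks 36-41
        right_ear = eye_aspect_ratio(pts[42:48])  # right-eye landmarks 42-47
        if (left_ear + right_ear) / 2.0 < EAR_THRESHOLD:
            return True
    return False
```

In practice such a check would be applied per video frame (e.g., at the 60 fps mentioned above) and combined with a temporal criterion or a learned classifier such as the DCCNN before raising the dashboard alarm.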