Abstract
Current driver-monitoring systems require a set-up with a variety of camera equipment mounted behind the steering wheel, which is highly impractical in a real-world environment because the set-up may annoy or inconvenience the driver. This project proposes a framework that uses mobile devices and cloud services to monitor the driver's head pose, detect angry expressions and drowsiness, and alert the driver with audio feedback. Using the phone's camera, the driver's facial expression data are collected and then analyzed via image processing on the Microsoft Azure platform. A working mobile app is developed that detects head pose, anger, and drowsiness by monitoring the driver's facial expressions. Whenever an angry or drowsy face is detected, a pop-up alert message and audio feedback are given to the driver. The benefit of this mobile app is that it reminds drivers to drive calmly and safely until they manage their emotions and anger or drowsiness is no longer detected. The performance of the mobile app in classifying the anger emotion is 96.66%, while its performance in detecting driver drowsiness is 82.2%. On average, the head-pose detection success rate across the six scenarios presented is 85.67%.
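To make the cloud pipeline concrete, the sketch below shows how a client could submit a single camera frame to the Azure Face REST API and read back the head-pose and emotion attributes the framework relies on. This is a minimal illustration, not the authors' implementation: the endpoint and key placeholders, the `analyze_frame` and `should_alert` helper names, and the anger threshold are assumptions, and the drowsiness logic used in the paper is not reproduced here.

```python
# Minimal sketch of querying the Azure Face API for head pose and emotion.
# Endpoint, key, helper names, and the anger threshold are illustrative assumptions.
import requests

AZURE_ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
AZURE_KEY = "<your-face-api-key>"                                       # placeholder
ANGER_THRESHOLD = 0.5  # hypothetical cut-off; the paper does not state its threshold


def analyze_frame(jpeg_bytes: bytes) -> dict:
    """Send one camera frame to the Face API and return head pose and anger score."""
    response = requests.post(
        f"{AZURE_ENDPOINT}/face/v1.0/detect",
        params={"returnFaceAttributes": "headPose,emotion"},
        headers={
            "Ocp-Apim-Subscription-Key": AZURE_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=jpeg_bytes,
    )
    response.raise_for_status()
    faces = response.json()
    if not faces:
        return {}  # no face detected in this frame
    attributes = faces[0]["faceAttributes"]
    return {
        "head_pose": attributes["headPose"],            # pitch, roll, yaw in degrees
        "anger_score": attributes["emotion"]["anger"],  # confidence in [0.0, 1.0]
    }


def should_alert(result: dict) -> bool:
    """Trigger a pop-up/audio alert when the anger score exceeds the threshold."""
    return bool(result) and result["anger_score"] >= ANGER_THRESHOLD
```

In a usage loop, the app would call `analyze_frame` on frames captured from the phone camera and play an audio alert whenever `should_alert` returns true, repeating until anger (or drowsiness, handled by logic not shown here) is no longer detected.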