Abstract

Distracted driving, such as phone use while driving, is risky because it increases the probability of severe crashes. Existing studies have attempted to detect distraction using Naturalistic Driving Studies, and most rely on facial motion, which is highly sensitive to lighting conditions and algorithm effectiveness and still cannot fully capture auditory and physical distractions. This study aims to optimize a Long Short-Term Memory (LSTM) model for phone-use detection based on vehicle dynamics sensor data from the Shanghai Naturalistic Driving Study (SH-NDS), China. A total of 1,244 phone-use events were extracted from SH-NDS videos and analyzed against a focused-driving baseline. Performance attributes included speed, longitudinal acceleration, lateral acceleration, lane offset, and steering wheel rate; their mean, standard deviation, and prediction error (PE) were calculated, yielding 15 indicators. A bidirectional layer and an attention mechanism were added to the LSTM model to improve accuracy. Results showed that, apart from the mean and standard deviation of steering wheel rate, the remaining 13 indicators were significant and effective in the model. The Bidirectional Long Short-Term Memory (Bi-LSTM) model achieved a promising accuracy of approximately 91.2% under 5-fold cross-validation, outperforming other machine learning methods such as the recurrent neural network, support vector machine, k-nearest neighbor, and adaptive boosting. This Bi-LSTM model with an attention mechanism could potentially be applied in advanced driver assistance systems to warn drivers and reduce phone-related distracted driving.

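The abstract describes a Bi-LSTM classifier with an attention mechanism over the 15 vehicle-dynamics indicators. The sketch below is a minimal illustration of that architecture, not the authors' code: the window length, layer sizes, and variable names are assumptions introduced for illustration, and the attention is a simple additive pooling over time steps.

```python
# Minimal sketch of a Bi-LSTM classifier with attention pooling,
# assuming fixed-length windows of 15 vehicle-dynamics indicators.
# TIMESTEPS and layer sizes are illustrative assumptions, not values from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

TIMESTEPS = 50    # assumed number of samples per event window
N_FEATURES = 15   # mean, std, and prediction error of the five driving signals

inputs = layers.Input(shape=(TIMESTEPS, N_FEATURES))

# Bidirectional LSTM returns the full hidden-state sequence so attention can pool over time.
h = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(inputs)

# Additive attention: score each time step, normalize with softmax, take the weighted sum.
scores = layers.Dense(1, activation="tanh")(h)            # (batch, T, 1)
weights = layers.Softmax(axis=1)(scores)                  # attention weights over time steps
context = layers.Lambda(lambda x: tf.reduce_sum(x[0] * x[1], axis=1))([h, weights])

# Binary output: phone use vs. focused-driving baseline.
outputs = layers.Dense(1, activation="sigmoid")(context)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```

In practice the model would be trained and evaluated with 5-fold cross-validation over the labeled event windows, as reported in the abstract; the ~91.2% accuracy figure refers to the authors' results, not to this sketch.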