Abstract

Traffic crashes typically unfold within a few seconds, and real-time prediction can significantly benefit traffic safety management and the development of safety countermeasures. This paper presents a novel deep learning model for crash identification based on high-frequency, high-resolution continuous driving data. The method consists of feature engineering based on a Convolutional Neural Network (CNN) and a Gated Recurrent Unit (GRU), and classification based on Extreme Gradient Boosting (XGBoost). The CNN-GRU architecture captures the time-series characteristics of driving kinematics data. Compared to normal driving segments, safety-critical events (SCEs)—i.e., crashes and near-crashes (CNC)—are rare. A weighted categorical cross-entropy loss and oversampling are used to address this class imbalance. An XGBoost classifier is used in place of a multi-layer perceptron (MLP) to achieve high precision and recall. The proposed approach is applied to the Second Strategic Highway Research Program Naturalistic Driving Study (SHRP 2 NDS) data with 1,820 crashes, 6,848 near-crashes, and 59,997 normal driving segments. The results show that in a 3-class classification system (crash, near-crash, normal driving segment), the overall model accuracy is 97.5%, and the precision and recall for crashes are 84.7% and 71.3%, respectively, substantially better than benchmark models. Furthermore, the recall for the most severe crashes is 98.0%. The proposed crash identification approach provides an accurate, highly efficient, and scalable way to identify crashes from high-frequency, high-resolution continuous driving data and has broad application prospects in traffic safety.
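To make the pipeline described above concrete, the sketch below shows one plausible realization in Python (Keras + xgboost): a CNN-GRU network trained with class weights that implement a weighted categorical cross-entropy, whose penultimate layer is then reused as a feature extractor feeding an XGBoost classifier. All layer sizes, window lengths, channel counts, and class weights are illustrative assumptions, not the authors' configuration.

```python
# Sketch of a CNN-GRU feature extractor + XGBoost classifier, as outlined in
# the abstract. Hyperparameters are assumptions for illustration only.
import tensorflow as tf
from tensorflow.keras import layers, models
from xgboost import XGBClassifier

NUM_CLASSES = 3    # crash, near-crash, normal driving segment
WINDOW_LEN = 300   # assumed: e.g., 30 s of 10 Hz kinematic samples
NUM_CHANNELS = 6   # assumed: e.g., 3-axis acceleration, speed, yaw rate, ...

def build_cnn_gru(window_len=WINDOW_LEN, num_channels=NUM_CHANNELS):
    """Conv1D layers learn local kinematic patterns; a GRU summarizes the sequence."""
    inputs = layers.Input(shape=(window_len, num_channels))
    x = layers.Conv1D(64, kernel_size=5, activation="relu", padding="same")(inputs)
    x = layers.MaxPooling1D(pool_size=2)(x)
    x = layers.Conv1D(128, kernel_size=3, activation="relu", padding="same")(x)
    x = layers.GRU(64)(x)  # fixed-length embedding of the driving segment
    features = layers.Dense(32, activation="relu", name="features")(x)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(features)
    return models.Model(inputs, outputs)

model = build_cnn_gru()
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])

# Class weights make the cross-entropy "weighted": rare SCE classes contribute
# more to the loss so normal driving does not dominate. Values are illustrative;
# oversampling the minority classes (not shown) is a complementary remedy.
class_weight = {0: 30.0, 1: 8.0, 2: 1.0}  # crash, near-crash, normal
# model.fit(X_train, y_train_onehot, class_weight=class_weight,
#           epochs=20, batch_size=128)

# After training, drop the softmax head and use the penultimate layer as a
# feature extractor; XGBoost makes the final 3-class decision on those features.
feature_extractor = models.Model(model.input,
                                 model.get_layer("features").output)
xgb = XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1)
# Z_train = feature_extractor.predict(X_train)
# xgb.fit(Z_train, y_train_labels)   # multi-class objective inferred from labels
# y_pred = xgb.predict(feature_extractor.predict(X_test))
```

Replacing the MLP head with XGBoost in this way lets a gradient-boosted ensemble, which is often more robust on imbalanced tabular features, make the final classification while the CNN-GRU supplies learned temporal features.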
