Abstract

In the United States, highway-railroad grade crossings are prone to congestion, which not only causes significant traffic delays for travelers but also poses potential threats to first responders during emergencies. Unfortunately, very limited research effort has been dedicated to developing practical systems that can assess traffic conditions at overcrowded grade crossings. The main challenge in evaluating congestion conditions at crossings is that multiple instance classes (i.e., vehicles, trains, and pedestrians) must be accurately detected, especially when densely packed. In this study, a novel convolutional neural network (CNN) named the dense traffic detection net (DTDNet) is developed. DTDNet integrates a Transformer Attention (TA) module for better modeling of global context information and a learning-to-match detection head that optimizes object detection and localization in a likelihood-probability fashion. To train and test DTDNet, a unique grade-crossing traffic image dataset covering both congested and normal traffic during daytime and nighttime is established. Experimental results on this dataset show that the proposed DTDNet achieves the highest mean average precision (mAP), 0.832, outperforming other state-of-the-art (SOTA) models. Field tests yield low mean absolute error (MAE), mean relative error (MRE), and root mean squared error (RMSE) values of 2.200, 1.890, and 0.280, respectively, suggesting that the proposed model performs satisfactorily and robustly in field applications under different environments.
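The abstract does not specify the internal design of the TA module, so the following is only a minimal sketch of how a Transformer-style attention block can add global context to CNN feature maps, using standard multi-head self-attention from PyTorch; the class name, head count, and residual arrangement are illustrative assumptions, not the paper's implementation.

```python
# Illustrative sketch only: a generic Transformer attention block over CNN
# feature maps, NOT the paper's actual TA module (its design is not given
# in the abstract).
import torch
import torch.nn as nn

class TransformerAttentionBlock(nn.Module):
    def __init__(self, channels: int, num_heads: int = 8):
        super().__init__()
        self.norm = nn.LayerNorm(channels)
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width) feature map from a CNN backbone
        b, c, h, w = x.shape
        tokens = x.flatten(2).transpose(1, 2)       # (batch, h*w, channels)
        normed = self.norm(tokens)
        out, _ = self.attn(normed, normed, normed)  # global self-attention
        out = out + tokens                          # residual connection
        return out.transpose(1, 2).reshape(b, c, h, w)
```

Because every spatial position attends to every other, a block like this can relate, for example, a queued vehicle at one edge of the frame to a train occupying the crossing, which is the kind of global context the abstract attributes to the TA module.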
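For the field-test metrics, the formulas below are the standard definitions of MAE, MRE, and RMSE; the assumption that they are computed over per-image object counts (rather than some other quantity) is mine, and the function and variable names are hypothetical.

```python
# Standard error metrics as reported in the abstract, sketched here under
# the assumption that predicted and ground-truth per-image counts are
# compared. Names are illustrative.
import numpy as np

def count_errors(predicted: np.ndarray, actual: np.ndarray) -> dict:
    diff = predicted - actual
    mae = np.mean(np.abs(diff))                           # mean absolute error
    mre = np.mean(np.abs(diff) / np.maximum(actual, 1))   # mean relative error,
                                                          # guarding zero counts
    rmse = np.sqrt(np.mean(diff ** 2))                    # root mean squared error
    return {"MAE": mae, "MRE": mre, "RMSE": rmse}
```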
