Abstract

In recent years, object detection based on deep learning has made great breakthroughs, greatly improving detection accuracy. However, most existing deep learning detection models are designed for multi-class object detection in natural scenes, which may lead to over-fitting when they are applied to structured, scene-specific railway images. Moreover, to meet the real-time detection requirements of a high-speed comprehensive inspection train running at 350 km/h, extremely high detection speed is required, and existing deep learning models struggle to satisfy this timeliness constraint. In this paper, we propose an optimized structured-region fully convolutional network (SR-FCN), which converts the detection of multiple small objects into the localization of a single structured region. The structured prior information of the rail track is fused into each stage of the deep learning pipeline, including sample construction, proposal region generation, network design, and the loss function constraint. By optimizing the region proposal network and reducing the number of anchors to traverse, the localization speed for railway objects is greatly improved, and localization errors caused by locally missing components or background interference are avoided, which improves detection robustness. The experimental results show that the proposed SR-FCN network not only achieves a high detection accuracy of up to 99.99%, but also maintains a fast detection speed that meets real-time detection at 350 km/h.
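The abstract does not give the exact anchor settings, so the following is only a minimal back-of-envelope sketch of why confining the region proposal network to a structured rail-track band reduces the number of anchors to traverse. The stride, anchor counts, frame size, and band coordinates below are hypothetical placeholders, not values from the paper.

```python
import numpy as np


def grid_anchor_count(img_h, img_w, stride=16, anchors_per_cell=9):
    """Anchors traversed by a conventional RPN over the full feature map."""
    return (img_h // stride) * (img_w // stride) * anchors_per_cell


def structured_anchor_count(img_h, img_w, track_x0, track_x1,
                            stride=16, anchors_per_cell=3):
    """Anchors traversed when proposals are confined to the rail-track band
    [track_x0, track_x1] and fewer anchor shapes are kept, since the
    structured region has a known, near-fixed geometry (assumption)."""
    band_w = max(track_x1 - track_x0, stride)
    return (img_h // stride) * (band_w // stride) * anchors_per_cell


if __name__ == "__main__":
    h, w = 1024, 2048  # hypothetical inspection-camera frame size
    full = grid_anchor_count(h, w)
    structured = structured_anchor_count(h, w, track_x0=768, track_x1=1280)
    print(f"full-image anchors:        {full}")
    print(f"structured-region anchors: {structured}")
    print(f"reduction factor:          {full / structured:.1f}x")
```

Under these assumed numbers the structured-region prior cuts anchor traversal by more than an order of magnitude, which is the kind of reduction that would allow the locating stage to keep pace with a 350 km/h inspection train; the actual speed-up depends on the paper's real network configuration.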
