Abstract
Rail wear occurs continuously owing to the rolling contact loads of trains and is a fundamental concern for railway operational safety. Point-based manual rail wear inspection cannot satisfy the increasing demand for rapid, low-cost, and continuous monitoring. This paper proposes a depth-plus-region fusion network for detecting rail wear on the running band, which is the collection of wheel–rail interaction traces. The proposed method involves the following steps. (i) A depth map estimated by a modified MiDaS model is used as guidance for exploiting the depth information of the running band for rail wear detection. (ii) The running band of a rail is segmented and extracted from images using an improved mask region-based convolutional neural network (Mask R-CNN) that incorporates scale and ratio information to perform instance segmentation of the running band images. (iii) A two-channel attention-fusion network is constructed to classify rail wear. To validate the approach, we collected real-world running band images together with rail wear data measured by a high-accuracy rail-profile measurement tool. The case-study results demonstrate that the proposed method can rapidly and accurately detect rail wear under different ambient light conditions; the recall of severe wear detection was 84.21%.
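To make step (iii) concrete, the minimal sketch below illustrates one plausible form of a two-channel attention-fusion classifier: one branch encodes the RGB running-band crop, a second branch encodes the estimated depth map, and a learned channel-attention gate re-weights the fused features before classification into wear-severity levels. The layer sizes, the squeeze-and-excitation-style gate, and the three-class output are illustrative assumptions, not the authors' published architecture.

import torch
import torch.nn as nn

class ConvBranch(nn.Module):
    """Small convolutional encoder; the same structure is used for both channels."""
    def __init__(self, in_channels: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global average pooling -> (B, 64, 1, 1)
        )

    def forward(self, x):
        return self.features(x).flatten(1)  # (B, 64)

class TwoChannelAttentionFusion(nn.Module):
    """Fuses RGB and depth features with a learned channel-attention gate (illustrative)."""
    def __init__(self, num_classes: int = 3):  # e.g., no / moderate / severe wear (assumed)
        super().__init__()
        self.rgb_branch = ConvBranch(in_channels=3)    # running-band image channel
        self.depth_branch = ConvBranch(in_channels=1)  # estimated depth-map channel
        self.attention = nn.Sequential(                # channel-attention gate over fused features
            nn.Linear(128, 32), nn.ReLU(),
            nn.Linear(32, 128), nn.Sigmoid(),
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, rgb, depth):
        fused = torch.cat([self.rgb_branch(rgb), self.depth_branch(depth)], dim=1)
        fused = fused * self.attention(fused)          # re-weight the fused channels
        return self.classifier(fused)                  # wear-severity logits

# Usage example: one RGB running-band crop and its single-channel depth map.
model = TwoChannelAttentionFusion()
logits = model(torch.randn(1, 3, 224, 224), torch.randn(1, 1, 224, 224))
print(logits.shape)  # torch.Size([1, 3])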