Abstract
Rail surface defects (RSDs) are a major problem that reduces operational safety. Unfortunately, existing RSD detection systems have very limited accuracy: current image-processing methods are not tailored to the railway track, and many fully convolutional network (FCN)-based methods suffer from blurry rail edges (RE). This paper proposes a new rail boundary guidance network (RBGNet) for salient rail surface (RS) detection. First, a novel architecture is proposed that fully exploits the complementarity between the RS and the RE to accurately identify the RS with well-defined boundaries. The RBGNet injects high-level RS object information into shallow RS edge features in a progressive fusion manner to obtain fine edge features, and then integrates the refined edge features with RS features at different high-level layers to predict the RS precisely. Second, an innovative hybrid loss consisting of binary cross entropy (BCE), structural similarity index measure (SSIM), and intersection-over-union (IoU) is proposed and incorporated into the RBGNet to supervise the network in learning the transformation between the input and the ground truth, which further refines the RS location and edges. In addition, an image-based model for RSD detection and quantification is developed and integrated for automatic inspection. Finally, experiments conducted on a complex unmanned aerial vehicle (UAV) rail dataset indicate that the system achieves a high detection rate with good adaptation capability in complicated environments.
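The abstract does not give the exact formulation of the hybrid loss, but a minimal PyTorch-style sketch of a combined BCE + SSIM + IoU objective for a predicted saliency map might look like the following. The equal weighting of the three terms, the box-window SSIM approximation, and the helper names (ssim_loss, iou_loss, hybrid_loss) are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def ssim_loss(pred, target, window_size=11, c1=0.01**2, c2=0.03**2):
    # Structural similarity computed with a uniform (box) window as a
    # simplification; a Gaussian window may be used in the original work.
    pad = window_size // 2
    mu_p = F.avg_pool2d(pred, window_size, stride=1, padding=pad)
    mu_t = F.avg_pool2d(target, window_size, stride=1, padding=pad)
    var_p = F.avg_pool2d(pred * pred, window_size, stride=1, padding=pad) - mu_p ** 2
    var_t = F.avg_pool2d(target * target, window_size, stride=1, padding=pad) - mu_t ** 2
    cov = F.avg_pool2d(pred * target, window_size, stride=1, padding=pad) - mu_p * mu_t
    ssim = ((2 * mu_p * mu_t + c1) * (2 * cov + c2)) / \
           ((mu_p ** 2 + mu_t ** 2 + c1) * (var_p + var_t + c2))
    return 1.0 - ssim.mean()

def iou_loss(pred, target, eps=1e-6):
    # Soft intersection-over-union on the probability map (NCHW tensors).
    inter = (pred * target).sum(dim=(2, 3))
    union = (pred + target - pred * target).sum(dim=(2, 3))
    return (1.0 - (inter + eps) / (union + eps)).mean()

def hybrid_loss(logits, target):
    # Equal-weight sum of the BCE, SSIM, and IoU terms; the weighting used
    # by RBGNet is not stated in the abstract and is assumed here.
    pred = torch.sigmoid(logits)
    l_bce = F.binary_cross_entropy_with_logits(logits, target)
    return l_bce + ssim_loss(pred, target) + iou_loss(pred, target)
```

Under this sketch, the pixel-level BCE term constrains RS location, the patch-level SSIM term emphasizes structure near the rail edges, and the map-level IoU term regularizes the overall region, which is one common rationale for combining the three.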