Abstract
Unmanned aerial systems (UASs) are increasingly applied to bridge inspection. A vision-guided UAS with a lightweight convolutional neural network is developed to detect and locate bridge cracks, spalling, and corrosion. The contributions are as follows: (1) Traditional UASs rely on the global positioning system (GPS), but GPS signals beneath a bridge are generally weak. To address this problem, a vision-guided UAS is designed in which a stereo vision-inertial fusion method provides position data in place of GPS and an ultrasonic ranger is used to avoid obstacles. (2) Most deep learning-based damage detection methods operate offline, which is unsuitable for UAS-based inspection because flight endurance is limited. To solve this problem, a lightweight end-to-end object detection network is proposed by replacing the backbone of the original You Only Look Once v3 (YOLOv3) network with MobileNetV2; its much faster inference allows it to be deployed on the onboard computer of the designed UAS so that real-time edge computing can be performed during inspection. (3) A damage localization method based on the vision positioning data and simultaneous localization and mapping (SLAM) is also proposed to meet the pressing need to locate damage within the whole structure. Finally, the proposed system is applied to inspect a long-span bridge, detecting and locating the most common damage types (cracks, spalling, and corrosion) with high accuracy and efficiency, which verifies the practicability of the system.
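To illustrate the backbone swap described in contribution (2), the following is a minimal sketch, not the authors' released implementation: the convolutional feature extractor of MobileNetV2 feeds a single YOLO-style detection head. The three-class damage set (crack, spalling, corrosion) follows the abstract; the anchor count, input size, and class/module names are illustrative assumptions.

```python
# Hedged sketch of a MobileNetV2-backbone, YOLO-style detector (single scale).
# Not the paper's network definition; layer choices beyond the backbone swap are assumed.
import torch
import torch.nn as nn
from torchvision.models import mobilenet_v2


class MobileNetYOLOHead(nn.Module):
    """MobileNetV2 feature extractor followed by one 1x1 conv detection head."""

    def __init__(self, num_classes: int = 3, num_anchors: int = 3):
        super().__init__()
        # Reuse only the convolutional features of MobileNetV2
        # (output stride 32, 1280 channels), dropping its classifier.
        self.backbone = mobilenet_v2().features
        # Each anchor predicts (tx, ty, tw, th, objectness) plus class scores.
        self.head = nn.Conv2d(1280, num_anchors * (5 + num_classes), kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.backbone(x))


if __name__ == "__main__":
    model = MobileNetYOLOHead()
    # One 416x416 RGB image, a common YOLOv3 input size (assumed here).
    dummy = torch.randn(1, 3, 416, 416)
    out = model(dummy)
    print(out.shape)  # torch.Size([1, 24, 13, 13]) -> 3 anchors * (5 + 3 classes)
```

Swapping the heavier Darknet-53 backbone for MobileNetV2's depthwise-separable convolutions is what reduces the parameter count and inference latency enough for onboard, real-time edge computing.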