Abstract
This paper presents a novel method for underwater crack detection that fuses optical and texture information. Underwater crack images are degraded by the harsh underwater environment, which makes crack detection challenging and, in particular, makes it difficult to capture crack details accurately. Improving detection accuracy therefore requires extracting more crack-related feature information. To this end, this paper proposes a dual-input-branch semantic segmentation model that fuses optical and texture information, and introduces the Convolutional Block Attention Module (CBAM) to enhance the performance of the segmentation model. The optimal network architecture was determined by selecting the backbone network and optimizer, and a custom Tversky loss function was introduced so that the segmentation model pays more attention to crack regions. The results show that the detection accuracy, IoU, and F1-score reach 96.07%, 0.95, and 0.96, respectively. Multiple comparative experiments validate the effectiveness of the proposed method; in particular, compared with the method that does not fuse texture information, accuracy, IoU, and F1-score are improved by 3.30%, 6.74%, and 7.88%, respectively. Finally, visualizing how crack features evolve within the detection model explains the operational mechanism of the proposed method. These results confirm that the proposed method significantly improves the accuracy of underwater crack detection and provides a novel approach to underwater defect detection.
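For reference, the standard Tversky loss on which such a custom formulation is typically based (the abstract does not state the exact weighting used in the paper, so this is only a baseline sketch) can be written as

\[
\mathcal{L}_{\text{Tversky}} = 1 - \frac{\mathrm{TP}}{\mathrm{TP} + \alpha\,\mathrm{FP} + \beta\,\mathrm{FN}},
\]

where TP, FP, and FN are the (soft) true-positive, false-positive, and false-negative counts for the crack class, and the weights \(\alpha\) and \(\beta\) control the penalties on false positives and false negatives. Choosing \(\beta > \alpha\) penalizes missed crack pixels more heavily, which is consistent with making the model "pay more attention to the crack area."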