Abstract
The detection of cracks is essential for assessing and maintaining building and road safety. However, the large appearance variations and complex topological structures of cracks pose challenges for automatic crack detection. To alleviate these challenges, we propose a deep multi-scale crack feature learning model for crack segmentation, called DeepCrackAT, based on an encoder–decoder network with feature tokenization and attention mechanisms. Specifically, we use hybrid dilated convolutions in the first three layers of the encoder–decoder to enlarge the network’s receptive field and capture more crack information. Then, we introduce a tokenized multilayer perceptron (Tok-MLP) in the last two layers of the encoder–decoder to tokenize and project high-dimensional crack features into a low-dimensional space, which reduces the number of parameters and improves the network’s robustness to noise. Next, we concatenate the features of corresponding encoder–decoder layers and introduce the convolutional block attention module (CBAM) to sharpen the network’s perception of critical crack regions. Finally, the five layers of features are fused to generate a binary segmentation map of the crack image. We conducted extensive experiments and ablation studies on two real-world crack datasets, on which DeepCrackAT achieved 97.41% and 97.25% accuracy, respectively. The experimental results show that the proposed method outperforms current state-of-the-art methods.
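To illustrate why hybrid dilated convolutions enlarge the receptive field, the sketch below computes the receptive field of a stack of stride-1 dilated convolutions from the standard formulas. The kernel sizes and dilation rates here are illustrative assumptions for the sketch, not values taken from the paper.

```python
# Receptive-field growth for stacked stride-1 dilated convolutions.
# A kernel of size k with dilation d has effective size d*(k-1)+1;
# each stride-1 layer then adds (effective size - 1) to the
# receptive field of the stack.

def effective_kernel(k: int, d: int) -> int:
    """Effective kernel size of a convolution with kernel k, dilation d."""
    return d * (k - 1) + 1

def stacked_receptive_field(kernels, dilations) -> int:
    """Receptive field of a sequence of stride-1 convolutions."""
    rf = 1
    for k, d in zip(kernels, dilations):
        rf += effective_kernel(k, d) - 1
    return rf

# Hypothetical hybrid-dilation schedule [1, 2, 5] for three 3x3 convs:
print(stacked_receptive_field([3, 3, 3], [1, 2, 5]))  # → 17
# Three plain 3x3 convs for comparison:
print(stacked_receptive_field([3, 3, 3], [1, 1, 1]))  # → 7
```

Under this (assumed) dilation schedule, three 3×3 convolutions cover a 17×17 region instead of 7×7, at the same parameter cost — the mechanism the abstract credits with capturing more crack context.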
Published in: Engineering Applications of Artificial Intelligence