Abstract

Drought-induced soil desiccation cracking has attracted great attention across disciplines with the advent of global climate change. Accurately extracting soil crack networks is essential for understanding the cracking mechanism. Inspired by recent advances of artificial intelligence (AI) in computer vision, we propose a new automatic soil crack recognition method based on a novel network architecture named Attention Res-UNet. Attention Res-UNet inherits the advantages of both residual learning, which eases the training of deeper networks, and U-Net, which is well suited to semantic segmentation. Moreover, an attention mechanism is employed to alleviate the influence of uneven illumination. First, soil crack images under different uneven illumination conditions are collected to create a new soil cracking image dataset. Then, a traditional method and multiple state-of-the-art deep learning semantic segmentation models are tested on the collected dataset. Finally, a professional evaluation standard, which considers both overall metrics (precision, recall, Dice, surface crack ratio) and crack details (total crack length, average crack width, number of crack segments), is proposed to evaluate the recognition results of the different models. Extensive experimental results demonstrate the superiority of the proposed Attention Res-UNet over traditional methods and other deep learning models in recognizing soil cracks under complex environmental conditions. The method is also applicable to crack recognition in other materials under complex environmental conditions.
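
The abstract does not give architectural details, so the following PyTorch sketch only illustrates the two ideas the name "Attention Res-UNet" combines: residual convolution blocks (Res-UNet) and attention-gated skip connections (Attention U-Net). Class names, channel sizes, and the exact gating formulation are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of the building blocks an Attention Res-UNet typically combines.
# All hyperparameters here are assumptions for illustration only.
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with a shortcut, as in residual learning."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
        )
        # 1x1 conv aligns channel counts so the shortcut can be added.
        self.shortcut = nn.Conv2d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()

    def forward(self, x):
        return torch.relu(self.body(x) + self.shortcut(x))


class AttentionGate(nn.Module):
    """Additive attention gate: the decoder signal g re-weights the encoder
    skip feature x, suppressing irrelevant background such as shadows caused
    by uneven illumination."""
    def __init__(self, g_ch, x_ch, inter_ch):
        super().__init__()
        self.w_g = nn.Conv2d(g_ch, inter_ch, 1)
        self.w_x = nn.Conv2d(x_ch, inter_ch, 1)
        self.psi = nn.Sequential(nn.Conv2d(inter_ch, 1, 1), nn.Sigmoid())

    def forward(self, g, x):
        # g and x are assumed to share the same spatial size here.
        alpha = self.psi(torch.relu(self.w_g(g) + self.w_x(x)))  # (N, 1, H, W)
        return x * alpha  # attention-weighted skip connection
```

In a U-Net-style decoder, each `AttentionGate` would filter an encoder skip feature before it is concatenated with the upsampled decoder feature, while `ResidualBlock` replaces the plain double-convolution blocks of a standard U-Net.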
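The evaluation standard pairs segmentation metrics with geometric crack descriptors. Since the abstract does not specify the measurement procedure, the snippet below is only one plausible way to derive those descriptors from a binary crack mask; the function name `crack_metrics`, the `mm_per_pixel` scale, and the skeleton-based length and width approximations are assumptions.

```python
# Hedged sketch: geometric crack descriptors from a binary mask (1 = crack, 0 = soil).
# Skeleton length as total crack length and area / length as average width are
# common pixel-counting conventions, assumed here rather than taken from the paper.
import numpy as np
from scipy import ndimage
from skimage.morphology import skeletonize


def crack_metrics(mask: np.ndarray, mm_per_pixel: float = 1.0) -> dict:
    mask = mask.astype(bool)
    crack_area_px = mask.sum()

    # Surface crack ratio: crack pixels over all image pixels.
    surface_crack_ratio = crack_area_px / mask.size

    # Total crack length: approximated by the number of skeleton pixels.
    total_length_px = skeletonize(mask).sum()

    # Average crack width: crack area divided by total crack length.
    avg_width_px = crack_area_px / total_length_px if total_length_px else 0.0

    # Number of crack segments: connected components of the mask.
    _, n_segments = ndimage.label(mask)

    return {
        "surface_crack_ratio": float(surface_crack_ratio),
        "total_length_mm": float(total_length_px * mm_per_pixel),
        "average_width_mm": float(avg_width_px * mm_per_pixel),
        "crack_segments": int(n_segments),
    }
```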
