Abstract

Periodic road crack monitoring is an essential procedure for effective pavement management, and highly efficient, accurate crack measurement remains a key research topic in both academia and industry. Automatic methods have gradually replaced traditional manual surveys because they deliver more reliable evaluation outputs with better efficiency; however, given their high cost and agencies' limited budgets, the required devices are not available for all functional classes of pavement or for every department. Recently, the widespread use of smartphones and digital cameras has made it possible to collect pavement surface crack images easily and at an affordable price. However, the quality of these crack images varies widely because of noise from the pavement background, roadways, and other sources, so traditional methods usually fail to extract accurate crack information from pavement images. This research therefore proposes a state-of-the-art pixelwise crack detection architecture called CrackU-net, which builds on advanced deep convolutional neural network technology. CrackU-net achieves pixelwise crack detection through convolution, pooling, transpose convolution, and concatenation operations, which together form its "U"-shaped architecture. The model is trained and validated on 3,000 pavement crack images (2,400 for training and 600 for validation) using the Adam algorithm. With a learning rate of 10⁻², CrackU-net achieves loss = 0.025, accuracy = 0.9901, precision = 0.9856, recall = 0.9798, and F-measure = 0.9842, while avoiding the false-positive crack detection problem. CrackU-net therefore outperforms both traditional approaches and the fully convolutional network (FCN) and U-net models for pixelwise crack detection.
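For readers who want a concrete picture of the operations named above, the following PyTorch sketch wires convolution, pooling, transpose convolution, and concatenation into a U-shaped encoder-decoder for pixelwise segmentation. The layer widths, network depth, and binary cross-entropy loss are illustrative assumptions, not the exact CrackU-net configuration; only the operation types, the Adam optimizer, and the 10⁻² learning rate are taken from the abstract.

```python
# Minimal sketch of a U-shaped encoder-decoder for pixelwise crack segmentation.
# Channel widths, depth, and the loss function are assumptions for illustration.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU, as commonly used in U-net-style models.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
    )

class CrackUNetSketch(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(3, 64)           # encoder: convolution
        self.enc2 = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)             # pooling halves the spatial size
        self.bottleneck = conv_block(128, 256)
        self.up2 = nn.ConvTranspose2d(256, 128, kernel_size=2, stride=2)  # transpose convolution
        self.dec2 = conv_block(256, 128)        # 256 = 128 (upsampled) + 128 (skip)
        self.up1 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec1 = conv_block(128, 64)
        self.head = nn.Conv2d(64, 1, kernel_size=1)  # per-pixel crack logit

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))  # concatenation (skip connection)
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return torch.sigmoid(self.head(d1))     # crack probability map

# Training setup mirroring the abstract: Adam optimizer with learning rate 1e-2.
model = CrackUNetSketch()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
criterion = nn.BCELoss()  # binary crack / non-crack pixel loss (assumed, not stated in the abstract)
```

A training loop would feed batches of pavement images and their pixelwise crack masks through `model`, compute `criterion` on the predicted probability map, and step `optimizer`; precision, recall, and F-measure can then be computed per pixel on the held-out validation images.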
