Abstract

Roads are essential transportation infrastructure that deteriorates over time, affecting economic development and social activity. Accurate and rapid recognition of road damage such as cracks is therefore necessary so that it can be repaired in time and further deterioration prevented. Traditional methods for recognizing cracks include survey vehicles equipped with various sensors, visual inspection of the road surface, and image-processing recognition algorithms; however, these approaches are costly and offer limited accuracy and speed. In recent years, deep learning networks have seen growing use in object recognition and other vision applications, and they have become a suitable alternative to traditional methods. In this paper, the YOLOv4 deep learning network is used to recognize four types of cracks: transverse, longitudinal, alligator, and oblique, using a set of 2000 RGB visible-light images. The proposed network, with multiple convolutional layers, extracts accurate semantic feature maps from the input images and classifies road cracks into the four classes. The network achieves a 1% error in the training phase and, in the testing phase, a 77% F1-score, 80% precision, 80% mean average precision (mAP), 77% recall, and 81% intersection over union (IoU). These results demonstrate the acceptable accuracy and appropriate performance of the model for road crack recognition.
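
For readers unfamiliar with the reported metrics, the following minimal sketch (not taken from the paper) shows the standard definitions of F1-score and intersection over union used in detection evaluation; the box coordinates and example values are illustrative assumptions only.

```python
# Minimal sketch of the standard F1 and IoU definitions (illustrative, not the paper's code).

def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

def iou(box_a, box_b) -> float:
    """Intersection over union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# Harmonic mean of the abstract's 80% precision and 77% recall (~0.785).
print(f1_score(0.80, 0.77))
# Two overlapping example boxes: intersection 25, union 175, IoU ~0.14.
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))
```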
