Abstract

Fast and accurate detection of airfield pavement damage is crucial to airport flight safety and airfield pavement maintenance, and an efficient, lightweight detection algorithm that can be embedded in mobile detection devices is in urgent demand. However, traditional Convolutional Neural Networks (CNNs) usually generate redundant feature maps during feature extraction or rely on extra operations during feature fusion to gain better performance, which greatly limits algorithm efficiency. We address this issue by proposing an accurate and efficient detection algorithm, YOLOv5-APD. The algorithm improves model performance in two ways: it speeds up training and inference by using cheaper operations during feature extraction, and it reduces model complexity by removing redundant nodes during feature fusion. We verified the detection performance of YOLOv5-APD on a self-built dataset and compared it with other state-of-the-art (SOTA) models, and then carried out ablation experiments to investigate the effects of the proposed model design and of image augmentation. Results showed that the proposed YOLOv5-APD outperformed the SOTA algorithms in both accuracy and efficiency, attaining the best mean average precision (mAP) of 0.924. The proposed model also achieved the fastest inference speed of 142 frames per second (FPS), with a model footprint of 8.3 GFLOPs and 8 MB of parameters.
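The abstract does not specify which "cheaper operations" replace standard convolutions in the backbone. A common choice for this kind of FLOP reduction is a Ghost-module-style block, where only part of the output feature maps come from a full convolution and the rest are generated by a cheap depthwise operation. The sketch below is only an illustration of that general idea, not the authors' implementation; the module name `CheapFeatureBlock` and all hyperparameters are assumptions.

```python
import torch
import torch.nn as nn


class CheapFeatureBlock(nn.Module):
    """Illustrative Ghost-style block (assumed design, not the paper's exact module):
    half of the output channels come from a standard convolution, the other half
    from a cheap depthwise convolution applied to those primary maps, which costs
    far fewer FLOPs than a full convolution of the same output width."""

    def __init__(self, in_ch: int, out_ch: int, kernel_size: int = 1, cheap_kernel: int = 3):
        super().__init__()
        primary_ch = out_ch // 2  # channels produced by the "expensive" convolution
        self.primary = nn.Sequential(
            nn.Conv2d(in_ch, primary_ch, kernel_size, padding=kernel_size // 2, bias=False),
            nn.BatchNorm2d(primary_ch),
            nn.SiLU(inplace=True),
        )
        # Cheap operation: depthwise convolution generating the remaining channels
        self.cheap = nn.Sequential(
            nn.Conv2d(primary_ch, out_ch - primary_ch, cheap_kernel,
                      padding=cheap_kernel // 2, groups=primary_ch, bias=False),
            nn.BatchNorm2d(out_ch - primary_ch),
            nn.SiLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = self.primary(x)
        # Concatenate the expensive and cheap halves along the channel dimension
        return torch.cat([y, self.cheap(y)], dim=1)


if __name__ == "__main__":
    block = CheapFeatureBlock(64, 128)
    out = block(torch.randn(1, 64, 80, 80))
    print(out.shape)  # torch.Size([1, 128, 80, 80])
```

Under these assumptions, roughly half of the channels are produced at depthwise cost, which is where the training- and inference-speed gains described in the abstract would come from.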
