Abstract

Substation equipment defect detection has always played an important role in equipment operation and maintenance. However, the task scenarios for substation equipment defect detection are complex and varied. Recent studies have revealed issues such as a significant missed detection rate for small-sized targets and diminished detection precision. At the same time, current mainstream detection algorithms are highly complex, which hinders deployment on resource-constrained devices. To address these problems, a small-target, lightweight defect detection algorithm for main substation scenes is proposed: Efficient Attentional Lightweight-YOLO (EAL-YOLO), whose detection accuracy exceeds that of current mainstream models while remaining advantageous in parameter count and floating point operations (FLOPs). First, EfficientFormerV2 is used to optimize the model backbone, and the Large Separable Kernel Attention (LSKA) mechanism is incorporated into the Spatial Pyramid Pooling Fast (SPPF) module to enhance the model's feature extraction capability. Second, a small-target neck network, Attentional Scale Sequence Fusion P2-Neck (ASF2-Neck), is proposed to strengthen the model's ability to detect small-target defects. Finally, to facilitate deployment on resource-constrained devices, a lightweight detection head with shared convolutions, the Lightweight Shared Convolutional Head (LSCHead), is proposed. Experiments show that compared with YOLOv8n, EAL-YOLO improves accuracy by 2.93 percentage points, reaching an mAP50 of 92.26% across 12 types of typical equipment defects. At the same time, its FLOPs and parameter count are 46.5% and 61.17% lower, respectively, than those of YOLOv8s, meeting the needs of substation defect detection.
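
To make the SPPF modification concrete, the following is a minimal PyTorch-style sketch (not the authors' code) of how an LSKA-style attention block can be attached to an SPPF module. The kernel size, dilation, and channel choices here are illustrative assumptions, not values taken from the paper.

import torch
import torch.nn as nn

class LSKA(nn.Module):
    """LSKA-style attention: a large kernel approximated by cascaded 1-D
    depth-wise convolutions (plain, then dilated), followed by a point-wise conv.
    Kernel size and dilation below are illustrative assumptions."""
    def __init__(self, dim, k=11, dilation=3):
        super().__init__()
        # local 1-D depth-wise convolutions (horizontal, then vertical)
        self.conv_h = nn.Conv2d(dim, dim, (1, 3), padding=(0, 1), groups=dim)
        self.conv_v = nn.Conv2d(dim, dim, (3, 1), padding=(1, 0), groups=dim)
        # dilated 1-D depth-wise convolutions enlarge the receptive field
        pad = dilation * (k - 1) // 2
        self.dconv_h = nn.Conv2d(dim, dim, (1, k), padding=(0, pad),
                                 dilation=(1, dilation), groups=dim)
        self.dconv_v = nn.Conv2d(dim, dim, (k, 1), padding=(pad, 0),
                                 dilation=(dilation, 1), groups=dim)
        self.pw = nn.Conv2d(dim, dim, 1)  # point-wise channel mixing

    def forward(self, x):
        attn = self.conv_h(x)
        attn = self.conv_v(attn)
        attn = self.dconv_h(attn)
        attn = self.dconv_v(attn)
        attn = self.pw(attn)
        return x * attn  # attention applied as a multiplicative gate

class SPPF_LSKA(nn.Module):
    """SPPF variant with LSKA applied to the pooled, concatenated features."""
    def __init__(self, c_in, c_out, k=5):
        super().__init__()
        c_hid = c_in // 2
        self.cv1 = nn.Conv2d(c_in, c_hid, 1)
        self.pool = nn.MaxPool2d(kernel_size=k, stride=1, padding=k // 2)
        self.attn = LSKA(c_hid * 4)
        self.cv2 = nn.Conv2d(c_hid * 4, c_out, 1)

    def forward(self, x):
        x = self.cv1(x)
        y1 = self.pool(x)
        y2 = self.pool(y1)
        y3 = self.pool(y2)
        return self.cv2(self.attn(torch.cat([x, y1, y2, y3], dim=1)))

In this sketch the separable 1-D depth-wise convolutions keep the added parameter and FLOP cost small relative to a full large-kernel convolution, which is consistent with the abstract's emphasis on lightweight deployment.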
