Abstract

During the acquisition of infrared images in substations, low-quality images with poor contrast, blurred details, and missing texture information frequently occur, which adversely affects downstream high-level vision tasks. To address this issue, this paper proposes an infrared image enhancement algorithm for substation equipment based on a self-attention cycle generative adversarial network (SA-CycleGAN). The proposed algorithm incorporates a self-attention mechanism into the CycleGAN model’s transcoding network to improve the mapping of infrared image information, enhance image contrast, and reduce the number of model parameters. Adding an efficient local attention (ELA) mechanism and a feature pyramid structure to the encoding network strengthens the generator’s ability to extract features and texture information from small targets in infrared images of substation equipment, effectively improving image details. In the discriminator, performance is further improved by constructing a dual-channel feature network. To accelerate the model’s convergence, the loss function of the original CycleGAN is optimized. Compared with several mainstream image enhancement algorithms, the proposed algorithm improves the quality of low-quality infrared images by an average of 10.91% in colorfulness, 18.89% in saturation, and 29.82% in feature similarity indices. In addition, the number of parameters in the proposed algorithm is reduced by 37.89% compared with the original model. Finally, the effectiveness of the proposed method in improving recognition accuracy is validated using the CenterNet object detection algorithm.
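To illustrate the kind of self-attention block the abstract refers to, below is a minimal PyTorch sketch of a SAGAN-style self-attention module that could be inserted into a CycleGAN generator's translation (transcoding) stage. All names (SelfAttention, the channel-reduction ratio, the learnable residual weight gamma) are illustrative assumptions for exposition and are not taken from the paper's implementation.

```python
# Hedged sketch: SAGAN-style self-attention for a CycleGAN generator feature map.
# Module name, reduction ratio, and placement are assumptions, not the authors' code.
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.key   = nn.Conv2d(channels, channels // reduction, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learned weight of the attention branch

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)  # (b, hw, c/r)
        k = self.key(x).flatten(2)                    # (b, c/r, hw)
        attn = torch.softmax(q @ k, dim=-1)           # (b, hw, hw) attention over positions
        v = self.value(x).flatten(2)                  # (b, c, hw)
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x                   # residual connection back to the input

# Example: apply attention to a mid-generator feature map (shape is preserved).
feat = torch.randn(1, 256, 32, 32)
print(SelfAttention(256)(feat).shape)  # torch.Size([1, 256, 32, 32])
```

In this kind of design, gamma starts at zero so training begins from the plain convolutional behavior and gradually learns how much long-range context to mix in, which is one common way such blocks are integrated into GAN generators.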
