Abstract
Infrared and visible image fusion is an important image enhancement technology that aims to generate high-quality fused images with prominent targets and rich textures in extreme environments. However, most existing fusion methods are designed for infrared and visible images captured under normal lighting. At night, the visible image degrades severely, so existing fusion methods lose texture detail and visual quality, which harms subsequent vision applications. To this end, this paper proposes a three-discriminator GAN-based infrared and visible image fusion method. Specifically, the method adds an illumination enhancement discriminator to a GAN-based dual-discriminator fusion network. This discriminator takes as input the fused image produced by the generator and a low-light-enhanced visible image; the adversarial game with this third discriminator ensures that the fused image output by the generator attains the desired brightness. In addition, the method proposes a compensation attention module that propagates the multi-scale features extracted by the feature extraction network, ensuring that the fused image retains important texture details. Compared with other fusion methods on the public MSRS, M3FD, RoadScene, and TNO datasets, the fusion results of this paper perform better in both quantitative metrics and qualitative visual effects, and also better enhance brightness information.
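The abstract describes a generator trained against three discriminators, where the third (illumination) discriminator pushes the fused image toward the brightness of a low-light-enhanced visible image. As a minimal sketch of how such a combined objective could look, the snippet below assumes the standard non-saturating GAN form for each term and hypothetical per-discriminator weights; the paper's actual loss functions and weighting are not given in the abstract.

```python
import math

def generator_adversarial_loss(d_ir, d_vis, d_illum, weights=(1.0, 1.0, 1.0)):
    """Hypothetical combined generator loss for a three-discriminator GAN.

    d_ir, d_vis, d_illum: each discriminator's probability (in (0, 1))
    that the fused image matches its reference domain -- the infrared
    image, the visible image, and the low-light-enhanced visible image,
    respectively. The generator minimises -log D(fused) for every
    discriminator, so fooling all three lowers the loss.
    """
    w_ir, w_vis, w_il = weights
    return -(w_ir * math.log(d_ir)
             + w_vis * math.log(d_vis)
             + w_il * math.log(d_illum))
```

Under this formulation, a fused image that all three discriminators accept (scores near 1) yields a loss near zero, while a dim fused image that the illumination discriminator rejects keeps the third term large, which is the mechanism the abstract attributes to the added discriminator.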