Abstract

Infrared and visible image fusion produces a fused image that combines infrared radiation information with visible texture information. Numerous fusion methods for infrared and visible images have been developed in recent years. However, the fused images generated by most existing methods suffer from loss of detail, fusion noise, low contrast, and blurred edges of fused objects. To preserve detailed source-image information, suppress fusion noise, achieve high contrast, and produce sharp object edges, this study proposes an infrared and visible image fusion network with a composite auto-encoder and a transformer–convolutional parallel mixed fusion strategy (TCPMFNet). The proposed network is based on an auto-encoder (AE) structure: a composite auto-encoder encodes abundant features from source image pairs, the transformer–convolutional parallel mixed fusion strategy is designed to achieve strong feature fusion performance, and a mesh connection decoder is developed to reconstruct the fused image by fully exploiting multi-level features. Ablation studies and comparison experiments conducted on the TNO test set demonstrate the effectiveness of the proposed network structure and its superiority over other state-of-the-art fusion methods. Furthermore, comparison experiments dedicated to infrared and RGB image fusion, conducted on the Road scenes test set, show that the proposed network also outperforms other state-of-the-art methods.
