Abstract

The goal of infrared and visible image fusion is to synthesize a single image that preserves as much complementary information as possible from a pair of infrared and visible source images. However, because they extract features with convolution operations whose receptive fields are local, most existing deep learning-based fusion methods fail to effectively exploit global contextual information in the source images and thus achieve limited fusion performance. To address this issue, this paper proposes an end-to-end skip-connecting group convolutional attention network, termed SCGAFusion, for fusing infrared and visible images. In SCGAFusion, a group convolutional attention block (GAB) is purposely designed to strengthen the network's ability to extract and utilize informative features. GAB introduces a group convolutional layer to obtain hierarchical features and employs a residual non-local attention module to capture long-range dependencies between pixels. In this way, it not only focuses on important regions and details but also leverages both local neighborhood and global contextual information. Additionally, a hierarchical feature compensation mechanism based on skip connections is devised to jointly exploit local and global features. Experimental results on several public datasets qualitatively and quantitatively demonstrate the advantage of SCGAFusion over other state-of-the-art fusion methods.
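To make the role of the residual non-local attention module concrete, the following is a minimal NumPy sketch of a generic residual non-local attention operation of the kind the abstract refers to. This is not the authors' implementation; the projection matrices (`w_theta`, `w_phi`, `w_g`, `w_out`) and dimensions are illustrative assumptions. Every spatial position attends to every other position, so the output mixes global context, while the residual path preserves the local features:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def residual_nonlocal_attention(x, w_theta, w_phi, w_g, w_out):
    """Simplified residual non-local attention (hypothetical sketch).

    x: (N, C) feature map with N spatial positions flattened into rows.
    Each position attends to all N positions, capturing long-range
    dependencies; the residual keeps the original local features.
    """
    theta = x @ w_theta                 # queries, (N, C')
    phi = x @ w_phi                     # keys,    (N, C')
    g = x @ w_g                         # values,  (N, C')
    attn = softmax(theta @ phi.T)       # (N, N) pairwise affinities
    y = (attn @ g) @ w_out              # aggregated global context, (N, C)
    return x + y                        # residual connection

rng = np.random.default_rng(0)
N, C, Cp = 16, 8, 4                     # 16 positions, 8 channels (illustrative)
x = rng.standard_normal((N, C))
w_theta, w_phi, w_g = (rng.standard_normal((C, Cp)) * 0.1 for _ in range(3))
w_out = rng.standard_normal((Cp, C)) * 0.1
out = residual_nonlocal_attention(x, w_theta, w_phi, w_g, w_out)
print(out.shape)  # (16, 8)
```

Because the affinity matrix is N x N, the cost grows quadratically with the number of positions, which is why such modules are typically applied to downsampled feature maps rather than full-resolution images.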
