Abstract

In this paper, a new infrared and visible image fusion method based on the non-subsampled contourlet transform (NSCT) and convolutional sparse representation (CSR) is proposed to overcome the difficulty of selecting the NSCT decomposition level, detail blurring in SR-based methods, and low contrast in CSR-based methods. In the proposed method, NSCT is first performed on the source images to obtain low-frequency NSCT approximation components and high-frequency NSCT detail components. The low-frequency approximation components are then merged with the CSR-based method, while the popular "max-absolute" fusion rule is applied to the high-frequency detail components. Finally, the inverse NSCT is performed on the fused low-pass result and the fused high-pass components to obtain the final fused image. Three representative groups of infrared and visible images were used in fusion experiments to evaluate the proposed algorithm. More specifically, on the popular Leaves image, the objective evaluation metrics Qabf, Qe, and Qp of the proposed method were 0.7050, 0.6029, and 0.7841, respectively; on the Quad image, Qabf, Qe, and Qp were 0.6527, 0.4843, and 0.5169, respectively; and on the Kayak image, Qabf, Qe, and Qp were 0.6882, 0.4470, and 0.5532, respectively. Compared with the fusion method based on NSCT and sparse representation, the metrics Qabf, Qe, and Qp increased by 1.54%, 10.57%, and 22.49% on average, respectively. These experimental results demonstrate that the proposed fusion algorithm achieves state-of-the-art performance in terms of both subjective visual quality and objective evaluation criteria.
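The "max-absolute" rule applied to the high-frequency detail components can be sketched as follows. This is a minimal illustration rather than the authors' implementation: the NSCT decomposition/reconstruction and the CSR-based low-frequency merging are not shown, and the coefficient maps are assumed to be NumPy arrays of equal shape.

```python
import numpy as np

def fuse_max_absolute(coeffs_a: np.ndarray, coeffs_b: np.ndarray) -> np.ndarray:
    """Merge two high-frequency coefficient maps by keeping, at each
    position, the coefficient with the larger absolute value."""
    return np.where(np.abs(coeffs_a) >= np.abs(coeffs_b), coeffs_a, coeffs_b)

# Toy 2x2 detail subbands (stand-ins for NSCT detail components of the
# infrared and visible source images).
detail_ir = np.array([[ 0.9, -0.1],
                      [-0.4,  0.7]])
detail_vis = np.array([[-0.2,  0.8],
                       [ 0.5, -0.3]])

fused = fuse_max_absolute(detail_ir, detail_vis)
# fused == [[0.9, 0.8], [0.5, 0.7]]
```

In the full pipeline, this rule would be applied subband by subband to every high-frequency NSCT detail component before the inverse NSCT combines them with the CSR-fused low-frequency component.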
