Abstract

In the CIELab color space, we propose a remote sensing image fusion method based on the nonsubsampled shearlet transform (NSST) and a pulse coupled neural network (PCNN), which aims to improve the efficiency and performance of remote sensing image fusion by combining the strengths of the two methods. First, the panchromatic (PAN) and multispectral (MS) images are transformed into the CIELab color space to obtain their color components. Second, the PAN image and the L component of the MS image are decomposed by NSST into the corresponding low-frequency and high-frequency coefficients. Third, the low-frequency coefficients are fused by an intersecting cortical model (ICM). The high-frequency coefficients are divided into sub-blocks, and the average gradient (AG) of each sub-block determines the linking strength β of the PCNN model, so that β is set adaptively according to the quality of each sub-block image; the sub-blocks are then fed into the PCNN to obtain an oscillation frequency graph (OFG), from which the fused high-frequency coefficients are selected. Finally, the fused L component is obtained by the inverse NSST, and the fused RGB color image is obtained through the inverse CIELab transform. Experimental results demonstrate that the proposed method provides better fusion results than other common methods.
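The abstract does not include code, but the adaptive-β step can be illustrated. The sketch below, in Python/NumPy, computes the average gradient (AG) of each high-frequency sub-block and maps it to a per-block linking strength β. The AG definition follows the form commonly used in fusion papers; the block size and the normalization of AG to a β value in [0, 1] are illustrative assumptions, not the paper's exact mapping, and the NSST decomposition and PCNN iteration themselves are not shown.

```python
import numpy as np

def average_gradient(block: np.ndarray) -> float:
    """Average gradient (AG) of an image block: mean of
    sqrt((dx^2 + dy^2) / 2) over the interior pixels, a common
    sharpness measure in image-fusion literature."""
    f = block.astype(np.float64)
    if f.shape[0] < 2 or f.shape[1] < 2:
        return 0.0
    dx = f[1:, :-1] - f[:-1, :-1]   # vertical finite differences
    dy = f[:-1, 1:] - f[:-1, :-1]   # horizontal finite differences
    return float(np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2.0)))

def adaptive_linking_strength(coeffs: np.ndarray, block_size: int = 16) -> np.ndarray:
    """Split a high-frequency sub-band into sub-blocks, compute AG per
    block, and map it to a per-block PCNN linking strength beta.
    The min-max normalization to [0, 1] is an assumed mapping chosen
    for illustration: sharper blocks receive larger beta."""
    h, w = coeffs.shape
    rows = int(np.ceil(h / block_size))
    cols = int(np.ceil(w / block_size))
    ag = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            blk = coeffs[i * block_size:(i + 1) * block_size,
                         j * block_size:(j + 1) * block_size]
            ag[i, j] = average_gradient(blk)
    rng = ag.max() - ag.min()
    return (ag - ag.min()) / rng if rng > 0 else np.full_like(ag, 0.5)
```

In use, β for each sub-block would parameterize the PCNN that processes that block; the block whose PCNN fires more (the larger OFG value) across the PAN and MS-L sub-bands would contribute its coefficients to the fused result.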
