Abstract

To address the loss of spatial detail and spectral information in remote sensing image fusion, a fusion method based on a convolution sampling transform is proposed. First, the two images to be fused are convolved, sampled, and filtered hierarchically, decomposing each into sub-images at different levels. These sub-images are then fused at corresponding locations and reconstructed into a fused image. For scenes with more complex terrain and objects, the demands on spatial detail and spectral information are higher, so we further apply a spatial transform to the fused image and the panchromatic image, replace the first component of the former with that of the latter, and obtain the final fused image by inverse transformation. Experimental results show that both proposed methods outperform traditional fusion algorithms such as the PCA transform, the IHS transform, and the wavelet transform. In the case of more complex terrain and objects, the second method achieves higher spatial resolution than the first but preserves less spectral information. Both are effective remote sensing image fusion methods.
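
The abstract describes the pipeline only at a high level. Purely as an illustration, the Python sketch below mimics that flow under several assumptions not stated in the abstract: a Gaussian-style low-pass kernel, three decomposition levels, a maximum-magnitude rule for fusing detail sub-images, per-band processing, and PCA as a stand-in for the spatial transform whose first component is replaced by the panchromatic band. It is a sketch of the general idea, not the authors' exact method.

```python
# Illustrative sketch of a convolution/sampling decomposition, sub-image fusion,
# reconstruction, and first-component substitution. Kernel, level count, fusion
# rule, and the PCA-based transform are assumptions for demonstration only.
import numpy as np
from scipy.ndimage import convolve, zoom

KERNEL = np.outer([1, 4, 6, 4, 1], [1, 4, 6, 4, 1]) / 256.0  # separable low-pass filter


def decompose(img, levels=3):
    """Hierarchically convolve, sample, and filter one band into sub-images."""
    subs, current = [], img.astype(float)
    for _ in range(levels):
        low = convolve(current, KERNEL, mode="reflect")    # low-pass filtering
        down = low[::2, ::2]                               # sampling (decimation)
        up = zoom(down, 2, order=1)[: current.shape[0], : current.shape[1]]
        subs.append(current - up)                          # detail sub-image at this level
        current = down
    subs.append(current)                                   # coarsest approximation
    return subs


def reconstruct(subs):
    """Invert the decomposition by upsampling and adding detail sub-images."""
    current = subs[-1]
    for detail in reversed(subs[:-1]):
        up = zoom(current, 2, order=1)[: detail.shape[0], : detail.shape[1]]
        current = up + detail
    return current


def fuse(band_a, band_b, levels=3):
    """Fuse corresponding sub-images of two bands and reconstruct the result."""
    subs_a, subs_b = decompose(band_a, levels), decompose(band_b, levels)
    fused = [np.where(np.abs(a) >= np.abs(b), a, b)        # keep the stronger detail
             for a, b in zip(subs_a[:-1], subs_b[:-1])]
    fused.append(0.5 * (subs_a[-1] + subs_b[-1]))          # average the approximations
    return reconstruct(fused)


def substitute_first_component(fused_ms, pan):
    """Replace the first transform component of the fused multispectral image
    with the panchromatic band, then invert the transform (PCA used here as a
    stand-in for the paper's spatial transform)."""
    h, w, bands = fused_ms.shape
    flat = fused_ms.reshape(-1, bands).astype(float)
    mean = flat.mean(axis=0)
    centered = flat - mean
    # Eigen-decomposition of the band covariance defines the forward transform.
    _, vecs = np.linalg.eigh(np.cov(centered, rowvar=False))
    vecs = vecs[:, ::-1]                                    # order components by variance
    comps = centered @ vecs
    # Match the panchromatic band's mean/std to the first component before substitution.
    pan_flat = pan.reshape(-1).astype(float)
    pan_flat = (pan_flat - pan_flat.mean()) / (pan_flat.std() + 1e-12)
    comps[:, 0] = pan_flat * comps[:, 0].std() + comps[:, 0].mean()
    return (comps @ vecs.T + mean).reshape(h, w, bands)     # inverse transformation
```

In this sketch the first method corresponds to calling `fuse` band by band, and the second method additionally passes the stacked fused bands and the panchromatic image through `substitute_first_component`; the extra substitution step is what trades some spectral fidelity for sharper spatial detail.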
