Abstract

Remote sensing image fusion extracts the spatial information of the panchromatic (PAN) image to sharpen the geometric structure of a multi-spectral (MS) image. Traditional algorithms that solve the fusion problem by applying various transformations often lose some spatial and spectral detail. To improve the quality of the fusion result, we develop a novel fusion method based on collaborative representation for multi-band remote sensing images. In the developed collaborative representation model, a spectral preservation coefficient based on the spectral contribution and the spectral-spatial dependency is designed to retain the spectral information of the low-resolution MS (LRMS) image. An intensity modulation coefficient based on the spectrally dependent spatial difference between the PAN and MS images is designed to adaptively recover and modulate the spatial detail of the MS image. Through the proposed collaborative representation model, the LRMS, low-resolution PAN, and PAN images, together with the designed coefficients, collaboratively represent the fusion image. Experimental results on various satellite datasets show that the developed method is effective and robust for pansharpening.
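
The abstract describes the model only at the level of its ingredients: the LRMS, low-resolution PAN, and PAN images combined through two designed coefficients. As a rough, non-authoritative illustration of that shape, the Python sketch below implements a generic detail-injection fusion, swapping in GSA-style least-squares spectral contribution weights as a stand-in for the spectral preservation coefficient and a per-band regression gain as a stand-in for the intensity modulation coefficient. The function name cr_pansharpen_sketch, the Gaussian degradation of PAN, and both coefficient forms are assumptions for illustration, not the authors' definitions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def cr_pansharpen_sketch(lrms_up, pan, sigma=2.0):
    """Illustrative coefficient-weighted detail-injection fusion.

    lrms_up : (H, W, B) MS image upsampled to the PAN grid
    pan     : (H, W) panchromatic image
    sigma   : blur used to simulate the low-resolution PAN
              (an assumption; the paper's degradation may differ)
    """
    pan = pan.astype(np.float64)
    H, W, B = lrms_up.shape
    bands = lrms_up.reshape(-1, B).astype(np.float64)

    # Low-resolution PAN: PAN degraded toward the MS spatial scale.
    lrpan = gaussian_filter(pan, sigma=sigma)
    # Spatial detail extracted from the PAN image.
    detail = (pan - lrpan).ravel()

    # Stand-in for the spectral preservation coefficient: least-squares
    # spectral contribution weights w with lrpan ~ bands @ w (GSA-style).
    w, *_ = np.linalg.lstsq(bands, lrpan.ravel(), rcond=None)
    intensity = bands @ w

    fused = np.empty_like(bands)
    for b in range(B):
        # Stand-in for the intensity modulation coefficient: a spectrally
        # dependent gain, the regression coefficient of band b on the
        # synthetic intensity.
        c = np.cov(bands[:, b], intensity)
        g = c[0, 1] / (c[1, 1] + 1e-12)
        # The fused band is represented by the LRMS band, the PAN detail,
        # and the two coefficients acting together.
        fused[:, b] = bands[:, b] + g * detail
    return fused.reshape(H, W, B)

# Example usage with random stand-in data (4-band MS on a 256x256 PAN grid).
rng = np.random.default_rng(0)
ms = rng.random((256, 256, 4))
pan = rng.random((256, 256))
fused = cr_pansharpen_sketch(ms, pan)
```

Making the injection gain depend on each band's regression against the synthetic intensity is one simple way to keep the injected PAN detail spectrally dependent, which is the role the abstract assigns to the intensity modulation coefficient.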
