Abstract

Remote sensing image fusion based on the detail injection scheme consists of two steps: spatial detail extraction and injection. The quality of the extracted spatial details plays an important role in the success of a detail injection scheme. In this paper, a remote sensing image fusion method based on adaptively weighted joint detail injection is presented. In the proposed method, the spatial details are first extracted from the multispectral (MS) and panchromatic (PAN) images through the à trous wavelet transform and a multiscale guided filter. Unlike the traditional detail injection scheme, the extracted details are then sparsely represented to produce the primary joint details by dictionary learning from the subimages themselves. To obtain the refined joint details, we subsequently design an adaptive weight factor that accounts for the correlation and difference between the primary joint details and the PAN image details. Finally, the refined joint details are injected into the MS image using a modulation coefficient to obtain the fused image. The proposed method has been tested on QuickBird, IKONOS, and WorldView-2 datasets and compared to several state-of-the-art fusion methods in both subjective and objective evaluations. The experimental results indicate that the proposed method is effective and robust to images from various satellite sensors.
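
To make the detail-injection pattern described above concrete, the minimal Python sketch below illustrates only its simplest form: extracting spatial details from a PAN image with the à trous (undecimated) wavelet transform and adding them to an upsampled MS band through a covariance-based gain. The multiscale guided filter, the dictionary-learning stage, and the adaptive weight factor of the proposed method are not reproduced here, and the function names (`atrous_details`, `inject_details`) are illustrative rather than taken from the paper.

```python
import numpy as np
from scipy.ndimage import convolve

def atrous_details(img, levels=2):
    """Extract spatial details via the a trous wavelet transform with a
    B3 cubic-spline kernel: the sum of all wavelet planes equals the image
    minus its coarse approximation at the last level."""
    kernel_1d = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0
    approx = img.astype(float)
    for level in range(levels):
        # Insert 2**level - 1 zeros ("holes") between kernel taps at each level.
        step = 2 ** level
        k = np.zeros(step * (len(kernel_1d) - 1) + 1)
        k[::step] = kernel_1d
        # Separable smoothing: rows then columns, with mirrored borders.
        approx = convolve(approx, k[None, :], mode='mirror')
        approx = convolve(approx, k[:, None], mode='mirror')
    return img.astype(float) - approx  # accumulated detail planes

def inject_details(ms_band_up, details, gain):
    """Classic detail injection: add weighted PAN details to an MS band
    that has been resampled to the PAN grid."""
    return ms_band_up + gain * details

# Toy usage with random arrays standing in for a co-registered PAN image
# and an MS band upsampled to the PAN resolution.
rng = np.random.default_rng(0)
pan = rng.random((128, 128))
ms_band_up = rng.random((128, 128))

details = atrous_details(pan, levels=2)
# A simple covariance-based modulation coefficient; the paper's adaptively
# weighted joint details replace this fixed per-band gain.
gain = np.cov(ms_band_up.ravel(), pan.ravel())[0, 1] / np.var(pan)
fused_band = inject_details(ms_band_up, details, gain)
print(fused_band.shape)
```

In practice the gain and the injected details would be computed per MS band, which is where the adaptive weighting and the sparse representation of the joint MS/PAN details described in the abstract come into play.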
