Abstract
In recent years, deep learning (DL) pansharpening methods have become the most popular solution for acquiring high-resolution multispectral (HRMS) images. However, most DL-based methods formulate the task as a black-box network architecture, ignoring the physical meaning of individual network modules. Such networks either struggle to extract sufficient spatial detail or lack interpretability. To address this problem, we propose a neural network pansharpening method with interpretable deep spatial detail injection, based on the assumptions of spectral consistency and double spatial detail priors in HRMS images. First, we assume that the HRMS image is composed of a spectral component and a spatial component. The spectral component and part of the spatial detail are obtained by upsampling the low-resolution multispectral (LRMS) image. Next, a variational model is constructed for fusing the LRMS image and the panchromatic (PAN) image into the HRMS image. The alternating direction method of multipliers (ADMM) is then used to solve the variational model and obtain the spatial detail that must be injected into the MS image. Finally, the optimization procedure is unrolled into a corresponding neural network structure, in which each network module corresponds to one step of the iterative algorithm. Our pansharpening method is therefore interpretable in terms of spatial detail injection. In our experiments, the proposed method achieves a favorable balance between spectral and spatial quality and generalizes well across different datasets, demonstrating its superiority.
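To make the deep-unrolling idea concrete, the sketch below shows how ADMM iterations for a spatial-detail variable can be mapped onto stacked network stages, with the HRMS output formed as the upsampled MS image plus the injected detail. This is a minimal illustrative sketch, not the authors' implementation: the specific data term (matching the PAN image to the band average of the HRMS estimate), the learned proximal network `ProxNet`, the class names, the number of stages, and all hyperparameters are assumptions introduced here for illustration.

```python
# Minimal sketch (assumptions, not the paper's code): a deep-unrolled ADMM
# detail-injection network. Each UnrollStage mimics one ADMM iteration:
# gradient step on an assumed data term, learned proximal update, dual update.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ProxNet(nn.Module):
    """Small CNN standing in for the learned proximal operator of the detail prior."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)  # residual proximal mapping


class UnrollStage(nn.Module):
    """One unrolled ADMM iteration for the spatial-detail variable D."""
    def __init__(self, ms_channels):
        super().__init__()
        self.step = nn.Parameter(torch.tensor(0.1))  # learnable gradient step size
        self.rho = nn.Parameter(torch.tensor(0.1))   # learnable ADMM penalty weight
        self.prox = ProxNet(ms_channels)

    def forward(self, d, z, u, pan, up_ms):
        # Assumed data term: the PAN image should match the band average
        # (intensity) of the current HRMS estimate up_ms + d.
        hrms = up_ms + d
        residual = hrms.mean(dim=1, keepdim=True) - pan
        grad = residual.expand_as(d) / d.shape[1]
        d = d - self.step * (grad + self.rho * (d - z + u))
        # Proximal update of the auxiliary variable (learned detail prior).
        z = self.prox(d + u)
        # Scaled dual (Lagrange multiplier) update.
        u = u + d - z
        return d, z, u


class UnrolledPansharpen(nn.Module):
    """Stacks K unrolled ADMM stages; output is up_ms plus the injected detail."""
    def __init__(self, ms_channels=4, num_stages=5):
        super().__init__()
        self.stages = nn.ModuleList(UnrollStage(ms_channels) for _ in range(num_stages))

    def forward(self, lrms, pan):
        up_ms = F.interpolate(lrms, size=pan.shape[-2:], mode='bicubic',
                              align_corners=False)
        d = torch.zeros_like(up_ms)  # spatial detail to be injected
        z = torch.zeros_like(up_ms)  # ADMM auxiliary variable
        u = torch.zeros_like(up_ms)  # scaled dual variable
        for stage in self.stages:
            d, z, u = stage(d, z, u, pan, up_ms)
        return up_ms + d


if __name__ == "__main__":
    model = UnrolledPansharpen(ms_channels=4, num_stages=5)
    lrms = torch.randn(1, 4, 64, 64)   # low-resolution multispectral input
    pan = torch.randn(1, 1, 256, 256)  # panchromatic input
    print(model(lrms, pan).shape)      # torch.Size([1, 4, 256, 256])
```

Because every stage carries its own step size, penalty weight, and proximal network, each module retains the physical meaning of the corresponding ADMM update, which is the source of the interpretability claimed for the method.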