Abstract
Pansharpening is an image fusion procedure that aims to produce a high-spatial-resolution multispectral image by combining a low-spatial-resolution multispectral image with a high-spatial-resolution panchromatic image. The most popular and successful paradigm for pansharpening is the detail injection framework, yet it cannot fully exploit the complex, non-linear complementary features of the two images. In this paper, we propose a detail-injection-model-inspired deep fusion network for pansharpening (DIM-FuNet). First, treating pansharpening as a complicated, non-linear detail learning and injection problem, we establish a unified optimization-based detail injection model with three detail fidelity terms: 1) a band-dependent spatial detail fidelity term, 2) a local detail fidelity term, and 3) a complicated detail synthesis term. Second, the model is optimized via iterative gradient descent and unfolded into a deep convolutional neural network. The resulting unrolled network has three branches: a point-wise convolutional sub-network and a depth-wise convolutional sub-network corresponding to the first two detail fidelity terms, and an adaptive weighted reconstruction module with a fusion sub-network that aggregates the details of the two branches and synthesizes the final complicated details. Finally, the deep unrolled network is trained end to end. Unlike traditional deep fusion networks, the architecture of DIM-FuNet is guided by the optimization model and therefore offers better interpretability. Experimental results at reduced and full resolution demonstrate the effectiveness of the proposed DIM-FuNet, which achieves the best performance compared with state-of-the-art pansharpening methods.
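To make the three-branch structure concrete, the following is a minimal NumPy sketch of one hypothetical fusion step: a point-wise (1x1) convolution mixes channels to model band-dependent spatial details, a depth-wise (per-channel 3x3) convolution models local details, and a scalar weight `alpha` stands in for the adaptive weighted reconstruction. All function names, the residual detail proxy, and the weighting scheme are illustrative assumptions, not the authors' actual DIM-FuNet implementation.

```python
import numpy as np

def pointwise_conv(x, w):
    # x: (C_in, H, W), w: (C_out, C_in) -- a 1x1 convolution that mixes
    # channels at each pixel (band-dependent detail branch).
    return np.tensordot(w, x, axes=([1], [0]))

def depthwise_conv(x, k):
    # x: (C, H, W), k: (C, 3, 3) -- per-channel 3x3 spatial filtering with
    # zero padding (local detail branch).
    C, H, W = x.shape
    xp = np.pad(x, ((0, 0), (1, 1), (1, 1)))
    out = np.zeros_like(x)
    for c in range(C):
        for i in range(H):
            for j in range(W):
                out[c, i, j] = np.sum(xp[c, i:i + 3, j:j + 3] * k[c])
    return out

def fuse_details(ms_up, pan, w_pw, k_dw, alpha):
    # Hypothetical single unrolled step: the PAN-minus-MS-mean residual is a
    # crude stand-in for the spatial detail signal; the two branch outputs
    # are combined with a fixed weight and injected into the upsampled MS.
    residual = pan[None] - ms_up.mean(axis=0, keepdims=True)
    d_band = pointwise_conv(np.broadcast_to(residual, ms_up.shape).copy(), w_pw)
    d_local = depthwise_conv(ms_up, k_dw)
    return ms_up + alpha * d_band + (1 - alpha) * d_local
```

In the actual network, the convolution weights and the aggregation would be learned end to end and the step repeated across unrolled iterations; this sketch only shows how the two detail branches operate on the same inputs and are merged.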
Published in: IEEE Transactions on Geoscience and Remote Sensing