Abstract
Convolutional neural networks (CNNs) have recently achieved impressive improvements on hyperspectral (HS) pansharpening. However, most CNN-based HS pansharpening approaches must first upsample the low-resolution hyperspectral image (LR-HSI) using bicubic interpolation or a data-driven training strategy, which inevitably loses details or relies heavily on the learning process. In addition, most previous methods regard pansharpening as a black-box problem and treat diverse features equally, thus hindering the discriminative ability of CNNs. To address these issues, a novel HS pansharpening method using a deep hyperspectral prior (DHP) and a dual-attention residual network (DARN) is proposed in this article. Specifically, we first upsample the LR-HSI to the scale of the panchromatic (PAN) image through the DHP algorithm, which better preserves spatial and spectral information without learning from large data sets. The upsampled result is then concatenated with the PAN image to form the input of the DARN, where several channel-spatial attention residual blocks (CSA ResBlocks) are stacked to map the residual HSI between the reference HSI and the upsampled HSI. In each CSA ResBlock, two complementary attention modules, i.e., a channel attention module and a spatial attention module, are designed to adaptively learn more informative features across spectral channels and spatial locations simultaneously, which effectively boosts fusion accuracy. Finally, the fused HSI is obtained by summing the upsampled HSI and the reconstructed residual HSI. Experimental results on both simulated and real HS data sets demonstrate that our DHP-DARN method is superior to state-of-the-art HS pansharpening approaches.
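To make the described pipeline concrete, the following is a minimal PyTorch sketch of one plausible CSA ResBlock together with the concatenate-predict-add fusion step. It is an illustration, not the authors' implementation: the squeeze-and-excitation style channel attention, the pooled-statistics spatial attention, the layer widths, reduction ratio, kernel sizes, block count, and band count are all assumptions made here for clarity.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Channel attention via global pooling and a bottleneck MLP
    (squeeze-and-excitation style; an assumed design, not the paper's)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # global spatial average per channel
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(self.pool(x))  # reweight spectral channels

class SpatialAttention(nn.Module):
    """Spatial attention from channel-pooled statistics (assumed design)."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)      # average over channels
        mx, _ = x.max(dim=1, keepdim=True)     # max over channels
        mask = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * mask                        # reweight spatial locations

class CSAResBlock(nn.Module):
    """Residual block applying both attentions to the residual branch."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        res = self.sa(self.ca(self.body(x)))
        return x + res  # identity skip connection

# Overall flow from the abstract, with hypothetical shapes: a 31-band
# DHP-upsampled HSI patch and a single-band PAN patch of the same size.
hsi_up = torch.randn(1, 31, 64, 64)
pan = torch.randn(1, 1, 64, 64)
darn = nn.Sequential(                     # assumed depth and width
    nn.Conv2d(32, 64, 3, padding=1),      # 31 HSI bands + 1 PAN band
    CSAResBlock(64), CSAResBlock(64),
    nn.Conv2d(64, 31, 3, padding=1),      # predicted residual HSI
)
fused = hsi_up + darn(torch.cat([hsi_up, pan], dim=1))
```

With this residual formulation, the network only has to learn the difference between the upsampled HSI and the reference HSI rather than the full image, which is the usual motivation for summing the upsampled HSI with a predicted residual.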