Abstract
Traditional approaches mainly fuse a hyperspectral image (HSI) with a high-resolution multispectral image (MSI) to improve the spatial resolution of the HSI. However, such improvement remains limited because the spatial resolution of MSIs is itself low. To further improve the spatial resolution of HSIs, we propose HyperNet, a deep network for the fusion of an HSI, an MSI, and a panchromatic image (PAN), which effectively injects the spatial details of the MSI and the PAN into the HSI while preserving the spectral information of the HSI. To this end, we design HyperNet on the basis of a uniform fusion strategy that handles the complex fusion of three types of sources (i.e., HSI, MSI, and PAN). In particular, the spatial details of the MSI and the PAN are extracted by multiple specially designed multi-scale-attention-enhancement blocks, in which multi-scale convolution adaptively extracts features from different receptive fields, and two attention mechanisms enhance the representation capability of features along the spectral and spatial dimensions, respectively. Through the feature reuse and interaction enabled by a specially designed dense-detail-insertion block, the extracted features are then injected into the HSI via unidirectional feature propagation among densely connected layers. Finally, we construct an efficient loss function by integrating the multi-scale structural similarity (MS-SSIM) index with the L1 norm, which drives HyperNet to generate high-quality results with a good balance between spatial and spectral quality. Extensive experiments on simulated and real data sets qualitatively and quantitatively demonstrate the superiority of HyperNet over other state-of-the-art methods.
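The loss design described above can be illustrated with a minimal sketch. For brevity this uses a simplified single-scale, global SSIM (no Gaussian windowing or multi-scale pyramid, unlike the MS-SSIM used in the paper), and the weighting factor `alpha` is a hypothetical choice, not a value from the paper:

```python
def ssim(x, y, c1=1e-4, c2=9e-4):
    """Simplified global (single-window) SSIM between two flat images in [0, 1]."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

def fusion_loss(pred, target, alpha=0.8):
    """Weighted sum of a structural term and a mean-L1 term:
    alpha * (1 - SSIM) + (1 - alpha) * L1. `alpha` is illustrative only."""
    l1 = sum(abs(a - b) for a, b in zip(pred, target)) / len(pred)
    return alpha * (1.0 - ssim(pred, target)) + (1.0 - alpha) * l1
```

The structural term encourages spatially plausible detail, while the L1 term penalizes per-pixel (and hence per-band spectral) deviation; weighting them trades off the spatial and spectral quality the abstract refers to.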
Published in: ISPRS Journal of Photogrammetry and Remote Sensing