Abstract

Due to physical hardware limitations, hyperspectral imaging cannot yet acquire images with high resolution in both the spatial and spectral dimensions. In most cases it produces images of low spatial resolution, which makes hyperspectral image (HSI) spatial super-resolution important. Recently, deep learning-based methods for HSI spatial super-resolution have been actively explored. However, existing methods do not focus on structural spatial-spectral correlation and global correlation along the spectrum, and therefore cannot fully exploit the information useful for super-resolution. Also, some of these methods are straightforward extensions of RGB super-resolution methods, which assume a fixed number of spectral channels and cannot be generally applied to hyperspectral images whose number of channels varies. Furthermore, unlike RGB image datasets, existing HSI datasets are small, which limits the performance of learning-based methods. In this article, we design a bidirectional 3D quasi-recurrent neural network for HSI super-resolution with an arbitrary number of bands. Specifically, we introduce a core unit that contains a 3D convolutional module and a bidirectional quasi-recurrent pooling module to effectively extract structural spatial-spectral correlation and global correlation along the spectrum, respectively. By combining domain knowledge of HSI with a novel pretraining strategy, our method generalizes well to remote sensing HSI datasets with a limited amount of training data. Extensive evaluations and comparisons on HSI super-resolution demonstrate improvements over state-of-the-art methods, in terms of both restoration accuracy and visual quality.
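
To make the described core unit concrete, below is a minimal sketch (not the authors' released code) of a bidirectional 3D quasi-recurrent unit, assuming PyTorch. A single 3D convolution extracts spatial-spectral features as a candidate tensor Z and a forget gate F, and quasi-recurrent pooling then runs along the spectral axis in both directions, which is why the unit can handle an arbitrary number of bands. Module and variable names here are illustrative assumptions.

```python
import torch
import torch.nn as nn


class BiQRU3D(nn.Module):
    """Hypothetical core unit: 3D convolution + bidirectional quasi-recurrent pooling."""

    def __init__(self, in_channels: int, hidden_channels: int, kernel_size: int = 3):
        super().__init__()
        pad = kernel_size // 2
        # One 3D convolution yields both the candidate features (Z) and the forget gate (F).
        self.conv = nn.Conv3d(in_channels, 2 * hidden_channels,
                              kernel_size, padding=pad)

    def _pool(self, z, f, reverse: bool = False):
        # Quasi-recurrent (f-)pooling along the spectral axis (dim=2):
        #   h_t = f_t * h_{t-1} + (1 - f_t) * z_t
        bands = range(z.size(2))
        if reverse:
            bands = reversed(list(bands))
        h, outputs = None, []
        for t in bands:
            z_t, f_t = z[:, :, t], f[:, :, t]
            h = (1 - f_t) * z_t if h is None else f_t * h + (1 - f_t) * z_t
            outputs.append(h)
        if reverse:
            outputs = outputs[::-1]
        return torch.stack(outputs, dim=2)

    def forward(self, x):
        # x: (batch, channels, bands, height, width)
        z, f = self.conv(x).chunk(2, dim=1)
        z, f = torch.tanh(z), torch.sigmoid(f)
        # Fuse the forward and backward passes over the spectral dimension.
        return self._pool(z, f) + self._pool(z, f, reverse=True)


# Usage: a 31-band patch; the same module also accepts other band counts.
if __name__ == "__main__":
    unit = BiQRU3D(in_channels=1, hidden_channels=16)
    hsi = torch.randn(2, 1, 31, 32, 32)
    print(unit(hsi).shape)  # torch.Size([2, 16, 31, 32, 32])
```

The 3D convolution captures local structural spatial-spectral correlation, while the bidirectional recurrence propagates information across all bands, capturing the global correlation along the spectrum described above.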
