Abstract
Remote sensing image super-resolution (SR) refers to a technique for improving spatial resolution, which in turn benefits subsequent image interpretation tasks, e.g., target recognition, classification, and change detection. In popular sparse representation-based methods, due to the complex imaging conditions and unknown degradation process, the sparse coefficients of low-resolution (LR) observed images are hardly consistent with those of their real high-resolution (HR) counterparts, which leads to unsatisfactory SR results. To address this problem, a novel coupled sparse autoencoder (CSAE) is proposed in this paper to effectively learn the mapping relation between LR and HR images. Specifically, the LR and HR images are first represented by sets of sparse coefficients, and then a CSAE is established to learn the mapping relation between them. Since the proposed method leverages the feature representation ability of both sparse decomposition and the CSAE, the mapping relation between LR and HR images can be accurately obtained. Experimentally, the proposed method is compared with several state-of-the-art image SR methods on three real-world remote sensing image datasets with different spatial resolutions. The extensive experimental results demonstrate that the proposed method achieves solid improvements in terms of average peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) on all three datasets. Moreover, the results also show that with larger upscaling factors, the proposed method achieves more prominent performance than the other competitive methods.
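The pipeline summarized above (sparse-code the LR and HR patches, then learn a map from LR codes to HR codes) can be sketched in NumPy. This is a minimal illustration on synthetic data: the dictionaries and patch matrices are random stand-ins, ISTA (iterative soft-thresholding) stands in for whatever sparse solver the paper uses, and a linear least-squares map replaces the paper's CSAE for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def ista(X, D, lam=0.1, n_iter=100):
    """Sparse-code the columns of X over dictionary D via ISTA
    (iterative soft-thresholding): min_A 0.5||X - DA||^2 + lam||A||_1."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    A = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(n_iter):
        G = A - (D.T @ (D @ A - X)) / L    # gradient step
        A = np.sign(G) * np.maximum(np.abs(G) - lam / L, 0.0)  # soft threshold
    return A

# Synthetic stand-ins: LR/HR patch matrices and pre-learned dictionaries.
d_lr, d_hr, k, n = 16, 64, 32, 200
D_lr = rng.standard_normal((d_lr, k)); D_lr /= np.linalg.norm(D_lr, axis=0)
D_hr = rng.standard_normal((d_hr, k)); D_hr /= np.linalg.norm(D_hr, axis=0)
X_lr = rng.standard_normal((d_lr, n))
X_hr = rng.standard_normal((d_hr, n))

# Step 1: represent LR and HR patches by sparse coefficients.
A_lr = ista(X_lr, D_lr)
A_hr = ista(X_hr, D_hr)

# Step 2: learn a mapping from LR codes to HR codes. A linear
# least-squares map is used here as a simplified stand-in for the CSAE.
M, *_ = np.linalg.lstsq(A_lr.T, A_hr.T, rcond=None)
A_hr_pred = (A_lr.T @ M).T

# Step 3: reconstruct HR patches from the mapped sparse codes.
X_hr_rec = D_hr @ A_hr_pred
```

At inference time, only steps 1-3 are needed for an unseen LR image: sparse-code its patches over `D_lr`, push the codes through the learned map, and reconstruct with `D_hr`.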
Published in: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing