Abstract
Real-world degradations deviate from the ideal degradations assumed in most deep learning scenarios, which synthesize the low-resolution (LR) counterpart images with the widely used bicubic interpolation. Moreover, supervised learning approaches rely on many high-resolution (HR) and LR image pairs to learn the association needed to reconstruct missing information, at the cost of long and complex deep neural network training. Additionally, a trained model's generalization to image datasets with different distributions is not guaranteed. To overcome these challenges, we propose our novel Self-FuseNet, designed particularly for extremely poor-resolution satellite images. The network also exhibits strong generalization performance on additional datasets (in both "ideal" and "nonideal" scenarios). The network is especially suited to image datasets suffering from the following two significant limitations: 1) nonavailability of ground-truth HR images and 2) too few unpaired images for deep neural network training. The benefit of the proposed model is threefold: 1) it does not require extensive training data, either paired or unpaired, but only a single LR image without prior knowledge of its distribution; 2) it is a simple and effective model for super-resolving very poor-resolution images, saving computational resources and time; and 3) built on UNet, its wide skip connections accelerate processing and allow image reconstruction with fewer parameters. Rather than the inverse approach common in most deep learning scenarios, we introduce a forward approach to super-resolve exceptionally LR remote sensing images. Experiments demonstrate its superiority over recently proposed state-of-the-art methods for unsupervised blind super-resolution of single real-world images.
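The core idea described in the abstract, training on nothing but a single LR image with a UNet-style network, can be illustrated with a zero-shot self-supervised loop. The following is a minimal PyTorch sketch, not the authors' Self-FuseNet: the names TinyUNet and self_supervised_sr, the architecture, and the ZSSR-style downscale-then-reconstruct objective are hypothetical stand-ins assumed for illustration.

```python
# Minimal sketch of single-image self-supervised super-resolution.
# Assumptions: a tiny UNet-like net with one skip connection and a
# ZSSR-style objective (learn LR-of-LR -> LR, then apply to LR itself).
# This is illustrative only and not the paper's actual model.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyUNet(nn.Module):
    """Small encoder-decoder with a skip connection, echoing the UNet
    with wide skip connections mentioned in the abstract (assumed)."""

    def __init__(self, ch=32):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(3, ch, 3, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(
            nn.Conv2d(ch, ch, 3, stride=2, padding=1), nn.ReLU()
        )
        self.dec = nn.Sequential(
            nn.Conv2d(2 * ch, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, 3, 3, padding=1),
        )

    def forward(self, x):
        e1 = self.enc1(x)                 # full-resolution features
        e2 = self.enc2(e1)                # downsampled features
        up = F.interpolate(e2, size=e1.shape[-2:], mode="bilinear",
                           align_corners=False)
        return self.dec(torch.cat([e1, up], dim=1))  # skip connection


def self_supervised_sr(lr_img, scale=2, steps=500, lr=1e-3):
    """Train on a (downscaled-LR, LR) pair derived from a single image,
    then upscale the LR image itself. lr_img: (1, 3, H, W) in [0, 1]."""
    net = TinyUNet()
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    h, w = lr_img.shape[-2:]
    # "LR of LR": the only supervision available without HR ground truth.
    lr_small = F.interpolate(lr_img, size=(h // scale, w // scale),
                             mode="bicubic", align_corners=False)
    for _ in range(steps):
        inp = F.interpolate(lr_small, size=(h, w), mode="bicubic",
                            align_corners=False)
        loss = F.l1_loss(net(inp), lr_img)
        opt.zero_grad()
        loss.backward()
        opt.step()
    with torch.no_grad():
        up = F.interpolate(lr_img, size=(h * scale, w * scale),
                           mode="bicubic", align_corners=False)
        return net(up).clamp(0, 1)  # apply learned mapping at test scale


if __name__ == "__main__":
    fake_lr = torch.rand(1, 3, 32, 32)  # stand-in for a real satellite tile
    sr = self_supervised_sr(fake_lr, scale=2, steps=50)
    print(sr.shape)  # torch.Size([1, 3, 64, 64])
```

Because the only training signal comes from the input image itself, no paired or unpaired external dataset is needed, which is the property the abstract emphasizes for satellite imagery lacking HR ground truth.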