Abstract
Image super-resolution reconstructs a blurry, low-resolution image with limited information into a clear, high-resolution image containing richer detail. The images generated by super-resolution reconstruction can enhance the performance of downstream computer vision tasks and have broad application prospects in fields such as industrial fault detection, plant phenotype parameter extraction, and medical imaging. High-frequency components in images, such as edges and texture details, typically require particular attention; however, when training samples are limited, recovering clear high-frequency detail becomes highly challenging. This paper therefore proposes a single-image super-resolution method based on generative adversarial networks, named DESRGAN. Compared to existing methods, DESRGAN reconstructs image details more faithfully even with a limited number of training samples. DESRGAN introduces several key innovations: a shallow generator structure that mitigates overfitting in small-sample scenarios, a dual-stream feature extraction network with dilated convolutions that captures multi-scale contextual information and expands the receptive field, and an artifact loss designed to suppress artifacts while preserving the true high-frequency details of the super-resolved images. Extensive ablation experiments and comparisons with multiple state-of-the-art models are conducted on two small-sample datasets, "Root" and "Leaves," as well as five publicly available datasets. The results demonstrate that DESRGAN achieves superior performance on small-sample single-image super-resolution tasks, with improvements of 1.39 dB in PSNR and 0.013 in SSIM. The generated high-resolution images exhibit clear texture and edge structures with favorable subjective visual quality, and the model shows strong generalization capability.
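The dilated convolutions mentioned above enlarge the receptive field without adding parameters, which is what lets the dual-stream branch gather multi-scale context. A minimal NumPy sketch of a 1-D dilated convolution and the stacked receptive-field arithmetic (illustrative only; DESRGAN's actual 2-D layers, channel counts, and stream fusion are not specified in the abstract and are not reproduced here):

```python
import numpy as np

def dilated_conv1d(x, w, dilation=1):
    """'Valid' 1-D cross-correlation with a dilated kernel.

    A kernel of length k with dilation d covers a footprint of
    d*(k-1)+1 input samples while using only k weights.
    """
    k = len(w)
    span = dilation * (k - 1) + 1  # effective kernel footprint
    return np.array([
        sum(w[j] * x[i + j * dilation] for j in range(k))
        for i in range(len(x) - span + 1)
    ])

def receptive_field(kernel_size, dilations):
    """Receptive field of stacked stride-1 dilated conv layers."""
    rf = 1
    for d in dilations:
        rf += d * (kernel_size - 1)
    return rf

x = np.arange(10, dtype=float)
w = np.array([1.0, 0.0, -1.0])          # simple difference (edge) kernel
y1 = dilated_conv1d(x, w, dilation=1)   # footprint 3: x[i] - x[i+2]
y2 = dilated_conv1d(x, w, dilation=2)   # footprint 5, same weight count

# Three stacked 3-tap layers with dilations 1, 2, 4 already see 15 samples.
print(receptive_field(3, [1, 2, 4]))    # -> 15
```

Running two such branches with different dilation rates and concatenating their outputs is one common way to realize a dual-stream multi-scale extractor; the exponential growth of the receptive field with stacked dilations is why this is attractive when the network must stay shallow to avoid overfitting on small datasets.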