Abstract

Recently, single image super-resolution (SISR), which aims to recover the structural and textural information lost in a low-resolution input image, has seen huge demand from the video and graphics industries. The success of convolutional neural networks (CNNs) has revolutionized the field of SISR. However, most CNN-based SISR methods consume excessive memory, in terms of both parameters and FLOPs, which hinders their deployment on devices with low computing power. Moreover, many state-of-the-art SR methods collect features while treating all pixels as contributing equally to the performance of the network. In this paper, we consider both performance and reconstruction efficiency, and propose a lightweight multi-scale attention residual network (MSAR-Net) for SISR. The proposed MSAR-Net consists of a stack of multi-scale attention residual (MSAR) blocks for feature refinement, and an up- and down-sampling projection (UDP) block for edge refinement of the extracted multi-scale features. These blocks effectively exploit multi-scale edge information without increasing the number of parameters. Specifically, we design our network in a progressive fashion, substituting combinations of the small scale factor (×2) for combinations of the large scale factor (×4), and thus gradually exploit hierarchical information. In parallel, channel and spatial attention in the MSAR block modulate the multi-scale features in global and local manners. Visual results and the quantitative metrics PSNR and SSIM demonstrate the accuracy of the proposed approach on synthetic benchmark super-resolution datasets. The experimental analysis shows that the proposed approach outperforms existing SISR methods in terms of memory footprint, inference time, and visual quality.
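The abstract does not include code, but the channel and spatial attention modulation it mentions can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: the gating here is a simple sigmoid over pooled statistics (squeeze-and-excitation style for the channel branch, a cross-channel mean map for the spatial branch), and the feature shape `(8, 16, 16)` is hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat):
    """Global modulation: reweight each channel by a gate computed
    from its global average. feat has shape (C, H, W)."""
    squeeze = feat.mean(axis=(1, 2))     # (C,) global channel descriptor
    gate = sigmoid(squeeze)              # per-channel weights in (0, 1)
    return feat * gate[:, None, None]

def spatial_attention(feat):
    """Local modulation: reweight each spatial position by a gate
    computed from the cross-channel mean at that position."""
    pooled = feat.mean(axis=0)           # (H, W) spatial descriptor
    gate = sigmoid(pooled)               # per-pixel weights in (0, 1)
    return feat * gate[None, :, :]

# Hypothetical multi-scale feature map: 8 channels on a 16x16 grid.
feat = np.random.randn(8, 16, 16)
out = spatial_attention(channel_attention(feat))
print(out.shape)  # (8, 16, 16): attention rescales, shape is preserved
```

In a real MSAR-style block the pooled descriptors would pass through small learned layers before gating; the sketch only shows the global-vs-local modulation idea.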
