Abstract

In recent years, deep convolutional neural networks (CNNs) have made significant progress on single-image super-resolution (SISR) tasks. Despite their high performance, many of these CNNs are impractical to deploy owing to their heavy computational requirements. This paper proposes a lightweight network for SISR, termed the attention-based multi-scale residual network (AMSRN). Specifically, residual atrous spatial pyramid pooling (ASPP) blocks and spatial and channel-wise attention residual (SCAR) blocks are stacked alternately to form the backbone of the network. The residual ASPP block uses parallel dilated convolutions with different dilation rates to capture multi-scale features. The SCAR block augments a two-layer convolutional residual block with channel attention (CA) and spatial attention (SA) mechanisms. In addition, group convolution is introduced into the SCAR block to further reduce the number of parameters while preventing over-fitting. Moreover, a multi-scale feature attention module is designed to provide instructive multi-scale attention information for shallow features. In particular, we propose a novel upscale module that adopts dual paths, jointly using a sub-pixel convolution layer and a nearest-neighbor interpolation layer to upscale the features, instead of a deconvolution layer or a sub-pixel convolution layer alone. The experimental results demonstrate that our method achieves performance comparable to state-of-the-art methods, both quantitatively and qualitatively.
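The dual-path upscale module described above can be sketched in a few lines of numpy. The sub-pixel path rearranges an r²-fold channel expansion into spatial resolution (the standard pixel-shuffle operation), while the second path upscales by nearest-neighbor interpolation. Note this is a minimal illustration, not the authors' implementation: the abstract does not specify how the two paths are fused, so element-wise addition is assumed here, and the convolution that expands the channels before pixel shuffle is omitted.

```python
import numpy as np

def pixel_shuffle(x, r):
    """Rearrangement step of sub-pixel convolution: (C*r^2, H, W) -> (C, H*r, W*r)."""
    c2, h, w = x.shape
    c = c2 // (r * r)
    x = x.reshape(c, r, r, h, w)      # split channels into (c, r, r)
    x = x.transpose(0, 3, 1, 4, 2)    # interleave with spatial dims: (c, h, r, w, r)
    return x.reshape(c, h * r, w * r)

def nearest_upsample(x, r):
    """Nearest-neighbor interpolation: (C, H, W) -> (C, H*r, W*r) by pixel repetition."""
    return x.repeat(r, axis=1).repeat(r, axis=2)

def dual_path_upscale(feat_subpixel, feat_nearest, r):
    """Hypothetical fusion of the two upscaling paths by element-wise addition
    (the fusion scheme is an assumption; the abstract does not specify it)."""
    return pixel_shuffle(feat_subpixel, r) + nearest_upsample(feat_nearest, r)

# Usage: upscale 3-channel 8x8 features by a factor of 2.
r = 2
f_sp = np.random.rand(3 * r * r, 8, 8)  # channels pre-expanded by a conv (assumed)
f_nn = np.random.rand(3, 8, 8)
out = dual_path_upscale(f_sp, f_nn, r)  # shape (3, 16, 16)
```

The pixel-shuffle rearrangement follows the usual convention (as in PyTorch's `nn.PixelShuffle`): output position (c, h·r+i, w·r+j) is read from input channel c·r²+i·r+j at (h, w).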
