Abstract

A skip-connection learning framework-based convolutional neural network (CNN) has recently achieved great success in image super-resolution (SR). However, most CNN models built on the skip-connection learning framework do not fully exploit the potential multi-scale features of images. In this paper, we propose a multi-scale skip-connection network (MSN) to improve the visual quality of image SR. First, convolution kernels of different sizes are exploited to capture the multi-scale features of low-resolution (LR) images, and all feature maps captured by kernels of the same size are fed directly into a multi-scale hybrid group (MHG). Second, the convolution layers of each MHG combine dilated convolutions with standard convolutions; this hybrid design fully trains on the feature details obtained from the preceding and current scale convolution layers. Third, the output of each hybrid convolution layer is fed into the subsequent hybrid convolution layers via skip-connections, producing dense connections. Finally, a meta-upscale module serves as the upscale module, magnifying the trained feature maps by arbitrary scale factors. Evaluated on a wide variety of images, the proposed MSN achieves an advantage over state-of-the-art methods in terms of both numerical results and visual quality.
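The key ingredient of the MHG described above is the dilated convolution, which enlarges a kernel's receptive field without adding weights. As a minimal sketch (not the authors' implementation), the following NumPy function illustrates the mechanism: a dilation factor d inserts (d-1) gaps between kernel taps, so a 3x3 kernel with d=2 covers a 5x5 neighborhood while still using only nine weights.

```python
import numpy as np

def dilated_conv2d(image, kernel, dilation=1):
    """Valid-mode 2D cross-correlation with a dilated kernel.

    With kernel size k and dilation d, the effective receptive field
    grows to k + (k - 1) * (d - 1) while the weight count stays k * k.
    dilation=1 reduces to a standard convolution layer's behavior.
    """
    k = kernel.shape[0]
    eff = k + (k - 1) * (dilation - 1)  # effective kernel size
    h, w = image.shape
    out = np.zeros((h - eff + 1, w - eff + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Strided slice picks every d-th pixel inside the window,
            # matching the dilated kernel's tap positions.
            patch = image[i:i + eff:dilation, j:j + eff:dilation]
            out[i, j] = np.sum(patch * kernel)
    return out

img = np.ones((8, 8))
ker = np.ones((3, 3))
std = dilated_conv2d(img, ker, dilation=1)  # 6x6 output, 3x3 field
dil = dilated_conv2d(img, ker, dilation=2)  # 4x4 output, 5x5 field
```

In an MHG, outputs of such dilated layers are concatenated with those of standard convolutions, so fine-grained local detail and wider-context features are trained jointly.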
