Abstract

Recent research on single image super-resolution (SISR) has achieved great success due to the development of deep convolutional neural networks. However, most existing SISR methods focus only on super-resolution at a single fixed integer scale factor. This simplified assumption does not match the complex conditions of real-world images, which often suffer from various blur kernels or various levels of noise. More importantly, previous methods lack the ability to cope with arbitrary degradation parameters (scale factors, blur kernels, and noise levels) with a single model. Few methods can handle multiple degradation factors, e.g., non-integer scale factors, blurring, and noise, simultaneously within a single SISR model. In this work, we propose a simple yet powerful method termed Meta-USR, which is the first unified super-resolution network for arbitrary degradation parameters with meta-learning. In Meta-USR, a meta-restoration module (MRM) is proposed to enhance the traditional upscale module with the capability to adaptively predict the weights of the convolution filters for various combinations of degradation parameters. Thus, the MRM can not only upscale the feature maps with arbitrary scale factors but also restore the SR image under different blur kernels and noise levels. Moreover, the lightweight MRM can be placed at the end of the network, which makes it very efficient for iteratively/repeatedly searching over various degradation parameters. We evaluate the proposed method through extensive experiments on several widely used SISR benchmark datasets. The qualitative and quantitative experimental results show the superiority of our Meta-USR.
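The core idea of the MRM — a small network that takes the degradation parameters as input and outputs the weights of a convolution filter — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the MLP architecture, random (untrained) weights, and the degradation vector `[scale, blur_width, noise_level]` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_filter(degradation, hidden=16, k=3):
    """Toy weight-prediction network (hypothetical): an MLP maps the
    degradation vector (scale, blur width, noise level) to a k x k
    convolution filter. Weights are random here; in practice the MLP
    would be trained jointly with the SR backbone."""
    d = np.asarray(degradation, dtype=np.float64)
    W1 = rng.standard_normal((hidden, d.size)) * 0.1
    W2 = rng.standard_normal((k * k, hidden)) * 0.1
    h = np.maximum(W1 @ d, 0.0)  # ReLU hidden layer
    return (W2 @ h).reshape(k, k)

def conv2d(img, kernel):
    """Valid 2-D cross-correlation applying the predicted filter."""
    k = kernel.shape[0]
    H, W = img.shape
    out = np.empty((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + k, j:j + k] * kernel)
    return out

feat = rng.standard_normal((8, 8))        # stand-in feature map
w = predict_filter([2.0, 1.2, 0.05])      # scale=2, blur=1.2, noise=0.05
restored = conv2d(feat, w)
print(restored.shape)                     # valid conv of 8x8 with 3x3 -> (6, 6)
```

Because the filter weights are a function of the degradation vector rather than fixed parameters, a single such module can serve arbitrary combinations of scale factor, blur kernel, and noise level at inference time.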

