Abstract
In convolutional sparse coding-based image super-resolution, the coefficients of the low- and high-resolution images at the same position are assumed to be equal, which enforces an identical structure on the low- and high-resolution images. In fact, however, the structure of high-resolution images is far more complicated than that of low-resolution images. To reduce the coupling between the low- and high-resolution representations, a semi-coupled convolutional sparse learning method (SCCSL) is proposed for image super-resolution. First, the proposed method uses nonlinear convolution operations as the mapping function between low- and high-resolution features, so that the conventional linear mapping becomes a special case of the proposed method. Second, the neighborhood within the filter size is used to compute each pixel, which improves the flexibility of the model; moreover, the filter size is adjustable. To illustrate the effectiveness of SCCSL, we compare it with four state-of-the-art methods on 15 commonly used images. Experimental results show that this work provides a more flexible and efficient approach to the image super-resolution problem.
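The mapping described above can be sketched in code. The function below is a hypothetical illustration, not the authors' implementation: each high-resolution feature map is obtained by convolving the low-resolution feature maps with mapping filters and applying a nonlinearity; the names `map_lr_to_hr` and `mapping_filters` and the use of `tanh` are assumptions for the sketch. Passing an identity nonlinearity recovers the purely linear mapping that the abstract identifies as a special case.

```python
import numpy as np

def map_lr_to_hr(lr_maps, mapping_filters, nonlinearity=np.tanh):
    """Hypothetical sketch of a semi-coupled mapping: each HR feature map
    is a nonlinearity applied to a sum of convolutions of the LR feature
    maps with small mapping filters. With nonlinearity=lambda t: t this
    reduces to a conventional linear mapping."""
    hr_maps = []
    for filters_j in mapping_filters:  # one filter bank per HR map
        acc = sum(np.convolve(z, w, mode="same")
                  for z, w in zip(lr_maps, filters_j))
        hr_maps.append(nonlinearity(acc))
    return hr_maps

# Toy usage (1D for brevity): 2 LR maps -> 2 HR maps, 5-tap filters.
rng = np.random.default_rng(1)
lr_maps = [rng.standard_normal(32) for _ in range(2)]
mapping_filters = [[rng.standard_normal(5) for _ in range(2)]
                   for _ in range(2)]
hr_maps = map_lr_to_hr(lr_maps, mapping_filters)
print(len(hr_maps), hr_maps[0].shape)  # 2 (32,)
```

Because the output at each position depends on a whole filter-sized neighborhood of the LR maps, and the filter length is a free parameter, the sketch also reflects the flexibility claims made in the abstract.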
Highlights
Conventional sparse coding (SC) [1,2,3] formulates a signal as a linear combination of a few atoms from a redundant dictionary
These 2D filters form the dictionary of convolutional sparse coding (CSC), and the feature maps, which can be seen as sparse coefficients, indicate the activation positions of the filters
As described in Equation (1), CSC convolves the filters with the feature maps; the filters can be seen as a dictionary, and the feature maps are the coding of the image
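The CSC model just described can be illustrated with a minimal sketch. This toy example (1D for brevity; the paper works with 2D images) is an assumption-laden illustration of the reconstruction x ≈ Σ_k d_k ∗ z_k, where the filters d_k play the role of dictionary atoms and the sparse feature maps z_k mark where each filter is activated; all variable names and values are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

n, K, m = 64, 3, 7                      # signal length, #filters, filter size
filters = [rng.standard_normal(m) for _ in range(K)]  # the "dictionary"

# Sparse feature maps: a few nonzeros each, i.e. activation positions.
maps = [np.zeros(n) for _ in range(K)]
maps[0][10] = 1.5
maps[1][[25, 40]] = [-0.8, 2.0]
maps[2][55] = 1.0

# CSC reconstruction: sum of filter / feature-map convolutions.
x = sum(np.convolve(z, d, mode="same") for z, d in zip(maps, filters))
print(x.shape)  # (64,)
```

Because each nonzero in a feature map stamps a shifted copy of its filter into the signal, sparsity in the maps directly corresponds to a small number of active dictionary atoms, mirroring the SC-to-CSC analogy drawn above.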