DOI: https://doi.org/10.1016/j.ins.2016.05.032
Journal: Information Sciences | Publication Date: Jun 7, 2016 | Citations: 65
In human-machine interaction, the human face is one of the core factors. However, due to the limitations of imaging conditions and low-cost imaging sensors, captured faces are often low-resolution (LR). This seriously degrades the performance of face detection, expression analysis, and face recognition, which are basic problems in human-machine interaction applications. Face super-resolution (SR) is the technology of inferring a high-resolution (HR) face from an observed LR one, and it has recently attracted wide attention. In this paper, we present a novel face SR method based on Tikhonov regularized neighbor representation (TRNR). It overcomes the technological bottlenecks (e.g., unstable solutions and noise sensitivity) of the patch representation scheme in traditional neighbor embedding based image SR methods. Specifically, we introduce a Tikhonov regularization term to regularize the representation of the observed LR patches, leading to a unique and stable solution of the least squares problem. Furthermore, we show the connection of the proposed model to the neighbor embedding model, least squares representation, sparse representation, and locality-constrained representation. Extensive face SR experiments are carried out to validate the generality, effectiveness, and robustness of the proposed algorithm. Experimental results on the public FEI face database and on surveillance images show that the proposed method achieves better performance in terms of reconstruction error and visual quality than existing state-of-the-art methods.
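The core idea of the abstract can be sketched in a few lines: represent each observed LR patch as a Tikhonov (ridge) regularized least squares combination of its K nearest LR training patches, then transfer the same coefficients to the aligned HR training patches. The following is a minimal illustrative sketch, not the authors' implementation; all function and parameter names (`trnr_patch_sr`, `k`, `lam`) are our own, and the neighbor search and patch dictionaries are assumptions about a typical patch-based SR pipeline.

```python
import numpy as np

def trnr_patch_sr(lr_patch, lr_dict, hr_dict, k=5, lam=1e-3):
    """Reconstruct one HR patch via Tikhonov-regularized neighbor representation.

    lr_patch: (d,) observed LR patch (flattened).
    lr_dict:  (N, d) LR training patches; hr_dict: (N, D) aligned HR patches.
    k: number of nearest neighbors; lam: Tikhonov regularization weight.
    """
    # 1. Find the k nearest LR training patches (Euclidean distance).
    dists = np.linalg.norm(lr_dict - lr_patch, axis=1)
    idx = np.argsort(dists)[:k]
    X = lr_dict[idx]                      # (k, d) neighbor patches

    # 2. Solve the Tikhonov-regularized least squares problem
    #      min_w ||lr_patch - X^T w||^2 + lam * ||w||^2,
    #    whose closed-form solution is unique because the Gram matrix
    #    plus lam*I is always invertible for lam > 0 (the "stable
    #    solution" property claimed in the abstract).
    A = X @ X.T + lam * np.eye(k)
    w = np.linalg.solve(A, X @ lr_patch)  # (k,) representation coefficients

    # 3. Transfer the same coefficients to the aligned HR neighbors.
    return hr_dict[idx].T @ w             # (D,) estimated HR patch
```

Setting `lam = 0` recovers plain least squares (neighbor embedding), which can be ill-posed when `k` exceeds the patch dimension; the regularizer is what guarantees uniqueness.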