Abstract

When palmprints are captured using non-contact devices, image blur inevitably arises because the palm is out of focus, and this degrades the recognition performance of the system. To solve this problem, we propose a stable-feature extraction method based on a Vese–Osher (VO) decomposition model to recognize blurred palmprints effectively. A Gaussian defocus degradation model is first established to simulate image blur. A theoretical analysis of blurring at different scales shows that stable features exist in the image. A VO decomposition model is then used to obtain the structure and texture layers of the blurred palmprint images; the structure layer is stable under different degrees of blurring (a theoretical conclusion that must be further verified by experiment). Next, an algorithm based on a weighted robustness histogram of oriented gradients (WRHOG) is designed to extract the stable features from the structure layer of the blurred palmprint image. Finally, a normalized correlation coefficient is introduced to measure the similarity between palmprint features. A series of experiments was designed and performed to show the benefits of the proposed method. The experimental results demonstrate the theoretical conclusion that the structure layer is stable across different blurring scales, and the WRHOG method proves to be a robust way of distinguishing blurred palmprints. The recognition results obtained using the proposed method on two palmprint databases (PolyU and Blurred–PolyU) are stable and superior to those of previous high-performance methods (the equal error rate is only 0.132%). In addition, the authentication time is less than 1.3 s, fast enough to meet real-time demands. The proposed method is therefore a feasible way of implementing blurred palmprint recognition.
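To make the pipeline described in the abstract concrete, the following Python sketch simulates Gaussian defocus blur and splits the blurred image into structure and texture layers. The paper solves the Vese–Osher PDE model for the decomposition; as an illustrative stand-in, the sketch uses total-variation denoising (Chambolle's algorithm from scikit-image), which produces a comparable cartoon/texture split. The function names and the placeholder ROI are assumptions for illustration, not the authors' code.

    import numpy as np
    from scipy.ndimage import gaussian_filter
    from skimage.restoration import denoise_tv_chambolle

    def simulate_defocus(image, sigma):
        # Defocus degradation modeled as convolution with a Gaussian PSF.
        return gaussian_filter(image.astype(np.float64), sigma=sigma)

    def structure_texture_split(image, weight=0.1):
        # TV denoising as a stand-in for the VO model: the smoothed result
        # approximates the structure layer, the residual the texture layer.
        structure = denoise_tv_chambolle(image, weight=weight)
        texture = image - structure
        return structure, texture

    roi = np.random.rand(128, 128)  # placeholder for a grayscale palmprint ROI in [0, 1]
    for sigma in (1.0, 2.0, 4.0):
        blurred = simulate_defocus(roi, sigma)
        structure, texture = structure_texture_split(blurred)
        # Claim under test: `structure` should change little as `sigma` grows.

Comparing the structure layers across the three blur scales (e.g., by correlation) is a quick way to check the stability claim before running full recognition experiments.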

Highlights

  • Biometrics attempts to effectively verify the identity of a living person using physiological or behavioral characteristics

  • We propose a VO–WRHOG method, combining VO decomposition with a weighted robustness histogram of oriented gradients (WRHOG), to solve the problem of blurred palmprint recognition

  • The experimental results demonstrate that blurred palmprints can be distinguished effectively by applying the WRHOG method to the structure layer of the blurred image

Summary

Introduction

Biometrics attempts to verify the identity of a living person effectively using physiological or behavioral characteristics. Based on a theoretical analysis of the existing methods, the research presented here concludes that an effective approach to blurred palmprint recognition is to extract stable features (containing principal components and orientation information) from the blurred image. A weighted robustness histogram of oriented gradients (WRHOG) method is proposed to extract these stable features from the structure layer of the blurred palmprint image; the WRHOG descriptor is obtained via the three steps described above.

[Figure: EER values obtained from palmprint images with different degrees of blurring using different methods: (a) HOG and VO–HOG, (b) RHOG and VO–RHOG, and (c) WRHOG and VO–WRHOG.]

[Figure: ROC curves for the high-performance and VO–WRHOG methods using data from the blurred PolyU palmprint database. doi:10.1371/journal.pone.0101866.g013]
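The paper's exact WRHOG construction (its weighting and robustness steps) is not reproduced in this summary, so the Python sketch below shows only the general shape of such a descriptor: block-wise orientation histograms weighted by gradient magnitude, L2-normalized per cell, compared with a normalized correlation coefficient for matching. All names and parameter values are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def oriented_gradient_histograms(image, cell=16, bins=8):
        # Per-cell orientation histograms weighted by gradient magnitude
        # (standard HOG weighting; the paper's WRHOG weighting may differ).
        gy, gx = np.gradient(image.astype(np.float64))
        mag = np.hypot(gx, gy)
        ang = np.mod(np.arctan2(gy, gx), np.pi)  # unsigned orientation in [0, pi)
        h, w = image.shape
        feats = []
        for y in range(0, h - cell + 1, cell):
            for x in range(0, w - cell + 1, cell):
                m = mag[y:y + cell, x:x + cell].ravel()
                a = ang[y:y + cell, x:x + cell].ravel()
                hist, _ = np.histogram(a, bins=bins, range=(0.0, np.pi), weights=m)
                feats.append(hist / (np.linalg.norm(hist) + 1e-12))  # per-cell L2 norm
        return np.concatenate(feats)

    def normalized_correlation(f1, f2):
        # Normalized correlation coefficient: similarity score in [-1, 1].
        f1 = f1 - f1.mean()
        f2 = f2 - f2.mean()
        return float(f1 @ f2 / (np.linalg.norm(f1) * np.linalg.norm(f2) + 1e-12))

In a verification setting, the descriptor would be computed on the structure layer of each palmprint ROI, and two palmprints declared a match when the normalized correlation exceeds a threshold chosen on a development set (e.g., at the equal error rate operating point).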

