Abstract

Palmprint direction patterns have been widely and successfully used in palmprint recognition methods. Most existing direction-based methods rely on pre-defined filters to obtain the genuine line responses in a palmprint image, which requires rich prior knowledge and usually ignores vital direction information. In addition, some line responses are corrupted by noise and degrade recognition accuracy. Furthermore, extracting discriminative features that make palmprints more separable remains a challenge for improving recognition performance. To address these problems, we propose to learn complete and discriminative direction patterns in this study. We first extract complete and salient local direction patterns, which consist of a complete local direction feature (CLDF) and a salient convolution difference feature (SCDF) extracted from the palmprint image. Two learning models are then proposed: one learns sparse and discriminative directions from the CLDF, and the other captures the underlying structure of the SCDFs in the training samples. Finally, the projected CLDF and the projected SCDF are concatenated to form the complete and discriminative direction feature for palmprint recognition. Experimental results on seven palmprint databases, as well as three noisy datasets, clearly demonstrate the effectiveness of the proposed method.
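
The overall pipeline summarized above (extract a CLDF and an SCDF, project each with a learned model, concatenate the projections) can be illustrated with a minimal sketch. The oriented filter bank, the feature definitions (extract_cldf, extract_scdf), and the random stand-in projections W_cldf and W_scdf below are hypothetical placeholders, not the authors' actual formulation; they only indicate the shape of the computation.

# Minimal, hypothetical sketch of the described pipeline (placeholders, not the paper's method).
import numpy as np
from scipy.ndimage import convolve

def direction_filter_bank(n_dirs=12, size=17, sigma=3.0):
    """Simple oriented line filters standing in for pre-defined direction filters."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    filters = []
    for k in range(n_dirs):
        theta = k * np.pi / n_dirs
        u = x * np.cos(theta) + y * np.sin(theta)    # along the line
        v = -x * np.sin(theta) + y * np.cos(theta)   # across the line
        f = np.exp(-(u ** 2) / (2 * (3 * sigma) ** 2)) * np.exp(-(v ** 2) / (2 * sigma ** 2))
        filters.append(f - f.mean())                 # zero-mean line detector
    return filters

def extract_cldf(img, filters):
    """Complete local direction feature: keep responses of ALL directions per pixel."""
    resp = np.stack([convolve(img, f, mode="nearest") for f in filters], axis=-1)
    return resp.reshape(-1)

def extract_scdf(img, filters):
    """Salient convolution difference feature: differences of adjacent direction responses."""
    resp = np.stack([convolve(img, f, mode="nearest") for f in filters], axis=-1)
    return (resp - np.roll(resp, shift=1, axis=-1)).reshape(-1)

# Toy usage with a random "palmprint" and random stand-ins for the learned projections.
img = np.random.rand(32, 32)
filters = direction_filter_bank()
cldf, scdf = extract_cldf(img, filters), extract_scdf(img, filters)

rng = np.random.default_rng(0)
W_cldf = rng.standard_normal((cldf.size, 64))        # would be learned (sparse, discriminative)
W_scdf = rng.standard_normal((scdf.size, 64))        # would be learned (structure-preserving)

# Final descriptor: concatenation of the two projected features.
feature = np.concatenate([cldf @ W_cldf, scdf @ W_scdf])
print(feature.shape)  # (128,)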
