Abstract
Traditional hand-crafted feature models cannot simultaneously cope with rotation, scaling, translation, and lighting variation. Moreover, the pre-processing step required by the traditional recognition framework, which inevitably distorts the existing feature distribution through denoising and contrast enhancement, often leads to a high false rejection rate. A convolutional neural network (CNN), which can recognize patterns with extreme variability and is robust to distortions and simple geometric transformations, is therefore adopted for the vein recognition task. Three modifications are made to the original CNN. First, a regularized RBF network is incorporated into the CNN as the last layer to perform the recognition task. Second, a self-growth strategy is used to train the feature-learning layers across the global model. Third, a parameter-learning algorithm for the newly added layer, combined with relearning of the resulting model, is designed to obtain the final RCNN-3 model, which yields discriminative feature representations and state-of-the-art classification results. Experiments on a lab-made hand-dorsa vein database achieve recognition rates of 91.25% in training and 89.43% in testing, and comparative experiments against both hand-crafted feature models and CNN models demonstrate the effectiveness of the proposed model for hand-dorsa vein recognition, as well as the necessity of introducing feature learning into the traditional vein recognition task.
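For concreteness, the sketch below illustrates one possible way to assemble a CNN feature extractor followed by an RBF classification layer of the kind described above. The layer sizes, the Gaussian-kernel formulation, and the names RBFLayer and VeinCNNWithRBF are illustrative assumptions, not the published RCNN-3 configuration or its self-growth training procedure.

```python
# Minimal sketch (assumed configuration): CNN feature-learning layers
# followed by an RBF output layer that classifies by distance to learned
# class centers. Not the authors' implementation.
import torch
import torch.nn as nn


class RBFLayer(nn.Module):
    """RBF output layer: Gaussian responses around learnable class centers."""

    def __init__(self, in_features: int, num_classes: int, gamma: float = 1.0):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, in_features))
        self.gamma = gamma

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Squared Euclidean distance from each feature vector to each center.
        dist_sq = torch.cdist(x, self.centers).pow(2)
        return torch.exp(-self.gamma * dist_sq)  # (batch, num_classes)


class VeinCNNWithRBF(nn.Module):
    """CNN feature extractor with an RBF classifier as the last layer."""

    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.rbf = RBFLayer(32 * 4 * 4, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.features(x).flatten(1)
        return self.rbf(feats)


if __name__ == "__main__":
    model = VeinCNNWithRBF(num_classes=100)
    dummy = torch.randn(8, 1, 64, 64)  # batch of grayscale vein images
    print(model(dummy).shape)          # torch.Size([8, 100])
```

In such a design, the RBF layer replaces the usual fully connected softmax head, so classification is driven by similarity to per-class prototypes rather than by a linear decision boundary.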