Abstract

Palm-vein recognition has been the focus of substantial research effort in recent years. Deep learning models, in particular convolutional neural networks (CNNs), automatically learn robust feature representations and thereby achieve good accuracy; however, this performance usually comes at the expense of annotating a large training dataset, and labeling vein images is an expensive and tedious process. Although handcrafted data-augmentation schemes usually yield slight performance gains, they cannot cover the complex variations that inherently characterize such images. To overcome this issue, we propose a new unsupervised domain adaptation model, called CycleGAN-based domain adaptation (CGAN-DA), that extracts discriminant representations from palm-vein images without requiring any image labeling. Our CGAN-DA model allows joint adaptation at the image and feature levels. Specifically, to enhance the domain invariance of the extracted features, image appearance is transformed across two domains: the palm-vein domain and the retinal domain. We employ several losses, namely adversarial losses, a segmentation loss, and a cycle-consistency loss, to train our model without any annotation from the target domain (palm-vein images). Our experiments on the public CASIA palm-vein dataset demonstrate that our model significantly outperforms the state of the art in terms of verification accuracy.
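The abstract does not provide implementation details, but as a rough illustration of the cycle-consistency term it mentions, the following is a minimal PyTorch sketch of the standard CycleGAN formulation: translating an image to the other domain and back should reconstruct the original. The generator names (G_pv2ret, G_ret2pv) and the loss weight lam are hypothetical, not from the paper.

```python
import torch
import torch.nn as nn

# Hypothetical generators (assumed names, not from the paper):
#   G_pv2ret maps palm-vein images to the retinal domain,
#   G_ret2pv maps retinal images back to the palm-vein domain.
l1 = nn.L1Loss()

def cycle_consistency_loss(G_pv2ret, G_ret2pv, palm_batch, retina_batch, lam=10.0):
    """Standard CycleGAN cycle loss: a round trip through both
    generators should return (approximately) the input image."""
    # palm-vein -> retinal -> palm-vein
    rec_palm = G_ret2pv(G_pv2ret(palm_batch))
    # retinal -> palm-vein -> retinal
    rec_retina = G_pv2ret(G_ret2pv(retina_batch))
    # lam weights the cycle term against the adversarial terms
    # (10.0 is the weight commonly used in the original CycleGAN).
    return lam * (l1(rec_palm, palm_batch) + l1(rec_retina, retina_batch))
```

Because this loss compares each image only with its own reconstruction, it requires no labels in either domain, which is what allows training without annotations from the target (palm-vein) domain.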
