Abstract

Recently, the vein has attracted attention from both academia and industry as one of the most promising biometric traits, owing to its liveness-based identification and the convenience of its acquisition process. State-of-the-art techniques provide relatively good performance, yet they are limited to specific light sources and adapt poorly to multispectral images. Despite the great success of convolutional neural networks (CNNs) in various image-understanding tasks, they often require large training sets and heavy computation, which are infeasible for palm-vein identification. To address these limitations, this work proposes a palm-vein identification system based on a lightweight CNN and an adaptive multispectral method with explainable AI. A principal component analysis on symmetric discrete wavelet transform (SMDWT-PCA) technique is adopted for vein-image augmentation, solving the problems of insufficient data and multispectral adaptability. Depthwise separable convolution (DSC) is applied to reduce the number of model parameters. To demonstrate the accuracy and robustness of the experimental results, the multispectral palm images of a public dataset (CASIA) are also used to assess the performance of the proposed method. As a result, the palm-vein identification system provides performance superior to that of previous related approaches across different spectra.
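The parameter savings from depthwise separable convolution can be illustrated with a simple count. The sketch below compares a standard k×k convolution with a DSC (a per-channel depthwise filter followed by a 1×1 pointwise mix); the layer sizes are hypothetical examples, not taken from the paper's architecture.

```python
# Illustrative parameter counts: standard convolution vs. depthwise
# separable convolution (DSC). Bias terms are omitted for simplicity.

def standard_conv_params(c_in: int, c_out: int, k: int) -> int:
    """A k x k standard convolution: every output channel mixes all inputs."""
    return c_in * c_out * k * k

def dsc_params(c_in: int, c_out: int, k: int) -> int:
    """One k x k depthwise filter per input channel, then a 1x1 pointwise mix."""
    depthwise = c_in * k * k
    pointwise = c_in * c_out
    return depthwise + pointwise

# Hypothetical layer: 64 input channels, 128 output channels, 3x3 kernel.
c_in, c_out, k = 64, 128, 3
std = standard_conv_params(c_in, c_out, k)  # 73,728 parameters
dsc = dsc_params(c_in, c_out, k)            # 8,768 parameters
print(std, dsc, round(std / dsc, 1))        # roughly 8.4x fewer parameters
```

For a 3×3 kernel the reduction factor approaches k² = 9 as the channel counts grow, which is why DSC is a common building block of lightweight CNNs.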
