Abstract

User authentication has traditionally been performed using methods such as passwords or fingerprints. However, passwords have security vulnerabilities, and fingerprints may hinder user convenience. To address these issues, a novel user authentication method based on biosignals, specifically electromyogram (EMG) signals, is proposed. Using biosignals such as EMG offers several advantages, including the ability to acquire data without user awareness, independence from the user’s environment, rapid acquisition, and enhanced security. However, a key challenge in using EMG signals for authentication has been their relatively low accuracy. In this paper, a neural network with a small number of parameters (fewer than 7000) is implemented to enable a wearable biosignal-based device, and user authentication accuracy is ensured using the maximal overlap discrete wavelet transform (MODWT) and a Siamese network. MODWT is highly effective for time-frequency analysis of time series data, and the Siamese network is a representative method for few-shot learning. The proposed neural network is verified on Chosun University’s user authentication dataset, which contains data from 100 individuals. Finally, the proposed network is implemented on an edge device, a field-programmable gate array (FPGA), so that it can be applied to a wearable user authentication system. With the Siamese network implemented on the FPGA-based edge device, user authentication achieved 94% accuracy with an authentication latency under 1.5 ms. Accuracy is expected to improve further by combining multiple biosignal modalities. Moreover, the proposed system can readily be fabricated as a digital integrated circuit (IC).
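
The abstract names MODWT-based feature extraction followed by a compact Siamese network as the core of the method. The sketch below is only a rough illustration of that pairing, not the paper's architecture: it uses PyWavelets' stationary wavelet transform (a close relative of the MODWT) as the feature stage and a small two-branch embedder trained with a contrastive loss. All layer sizes, the window length, the wavelet choice, and every function name here are assumptions made for illustration.

```python
"""Minimal sketch: wavelet features + compact Siamese embedder for EMG
authentication. Sizes and names are illustrative assumptions only."""
import numpy as np
import pywt
import torch
import torch.nn as nn
import torch.nn.functional as F


def wavelet_features(emg_window, wavelet="db4", level=3):
    """Undecimated (stationary) wavelet transform of one EMG window.

    PyWavelets exposes the SWT, which is closely related to the MODWT
    used in the paper; the coefficient arrays are stacked as channels.
    The window length must be divisible by 2**level.
    """
    coeffs = pywt.swt(emg_window, wavelet, level=level, trim_approx=True)
    return np.stack(coeffs, axis=0).astype(np.float32)  # (level+1, window_len)


class SmallEmbedder(nn.Module):
    """Tiny 1-D CNN embedder, kept to roughly a thousand parameters to
    stay well under the paper's <7000-parameter budget."""

    def __init__(self, in_channels=4, embed_dim=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, 8, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(8, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(16, embed_dim),
        )

    def forward(self, x):
        # L2-normalized embeddings so distances are bounded and comparable.
        return F.normalize(self.net(x), dim=1)


def contrastive_loss(z1, z2, same_user, margin=1.0):
    """Standard contrastive loss: pull genuine pairs together, push
    impostor pairs apart by at least `margin`."""
    d = F.pairwise_distance(z1, z2)
    return (same_user * d.pow(2) +
            (1 - same_user) * F.relu(margin - d).pow(2)).mean()


if __name__ == "__main__":
    # Synthetic stand-ins for two 256-sample EMG windows.
    emg_a, emg_b = np.random.randn(256), np.random.randn(256)
    xa = torch.from_numpy(wavelet_features(emg_a)).unsqueeze(0)
    xb = torch.from_numpy(wavelet_features(emg_b)).unsqueeze(0)

    model = SmallEmbedder(in_channels=xa.shape[1])
    za, zb = model(xa), model(xb)
    print("parameters:", sum(p.numel() for p in model.parameters()))
    print("loss:", contrastive_loss(za, zb, torch.tensor([1.0])).item())
```

At enrollment, embeddings of a few genuine EMG windows would be stored as templates; at authentication time, a new window's embedding is compared against them by distance, which is what makes the few-shot Siamese formulation attractive for per-user enrollment with little data.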
