Abstract

The gait of a subject follows a specific pattern, yet it contains variations that are unique to each subject and distinct from those of other subjects. This can be utilized for biometric authentication to prevent impersonation during gait studies. However, the dynamic nature of gait, such as changes in walking speed, makes gait-based biometric authentication challenging. In the state of the art, deep learning and other signal processing methods have been applied to biometric authentication with reliable results, but these approaches are highly resource-consuming, require several sensors, or depend on an expensive framework, which makes them difficult to deploy in many scenarios. Therefore, a knowledge gap exists in building a reliable, inexpensive and resource-efficient gait biometric authentication system. This paper proposes a method that uses a single embedded IMU sensor with a microcontroller to track a subject's motion, eliminates gait speed differences on-device in a resource-efficient manner through a proposed homologous time approximation warping algorithm, and builds a resource-efficient TinyML model for reliable biometric authentication. In an experiment with 20 consenting human subjects, the microcontroller's on-device accuracy score for decision-making by the TinyML model was found to be 0.9276. The resource efficiency of the model is further discussed on the basis of memory profiling. In addition, the prediction performance of the microcontroller with the proposed optimization was found to be only 8% slower than that of a personal computer, even though a personal computer runs several thousand processes in parallel. The work needs to be tested further on a larger sample space, and data privacy needs to be addressed.
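
The abstract names, but does not detail, the homologous time approximation warping algorithm used to remove gait-speed differences on-device. As a loose illustration of that general idea only, and not the paper's actual method, the C sketch below resamples a variable-length gait cycle of IMU readings onto a fixed-length grid via linear interpolation so that slow and fast cycles become directly comparable for a small on-device model; the function name, buffer layout, and the GAIT_CYCLE_POINTS constant are assumptions made for this example.

```c
#include <stddef.h>

/*
 * Hypothetical illustration only: resample one variable-length gait
 * cycle of IMU samples onto a fixed number of points using linear
 * interpolation, so cycles recorded at different walking speeds have
 * the same length before being fed to a TinyML classifier.
 */
#define GAIT_CYCLE_POINTS 64  /* assumed fixed output length */

void resample_gait_cycle(const float *in, size_t in_len,
                         float out[GAIT_CYCLE_POINTS])
{
    if (in_len == 0) {
        return;  /* nothing to resample */
    }
    if (in_len == 1) {
        for (size_t i = 0; i < GAIT_CYCLE_POINTS; ++i) {
            out[i] = in[0];  /* degenerate cycle: repeat the sample */
        }
        return;
    }
    for (size_t i = 0; i < GAIT_CYCLE_POINTS; ++i) {
        /* Map output index i onto the input time axis. */
        float pos = (float)i * (float)(in_len - 1) /
                    (float)(GAIT_CYCLE_POINTS - 1);
        size_t lo = (size_t)pos;
        size_t hi = (lo + 1 < in_len) ? lo + 1 : lo;
        float frac = pos - (float)lo;
        /* Linearly interpolate between neighbouring input samples. */
        out[i] = in[lo] + frac * (in[hi] - in[lo]);
    }
}
```

Linear resampling of this kind is memory-light and branch-light, which is one plausible reason a warping-style normalization can run on a microcontroller alongside the inference step.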
