Abstract

Wearable-sensor gait signals processed with machine learning algorithms have been shown to be reliable for user authentication. However, no study has yet investigated the influence of elapsed time on wearable-sensor-based gait authentication performance. This work is the first exploratory study to evaluate IMU gait-based authentication using accelerometer and gyroscope signals collected from 144 participants at slow, normal, and fast walking speeds over two sessions separated by one month. Gait signals are recorded at six device positions (left and right pocket, left and right hand, handbag, and backpack). Users' identities are verified with a robust gait authentication method called Adaptive 1-Dimensional Time Invariant Learning (A1TIL). In A1TIL, 1D Local Ternary Patterns (LTP) with an adaptive threshold are proposed to extract discriminative, time-invariant features from each gait cycle. In addition, a new unsupervised learning method called Kernelized Domain Adaptation (KDA) is applied to match two gait signals recorded at different times for user verification. Comprehensive experiments have been conducted to assess the effectiveness of the proposed approach on a newly developed time-invariant inertial sensor dataset. A promising Equal Error Rate (EER) of 4.38% for slow walking speed and the right pocket position across one month demonstrates that gait signals extracted from inertial sensors can serve as a reliable biometric over time.
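
The exact A1TIL formulation and its adaptive-threshold rule are not given in this abstract; the following is only a minimal sketch of how a generic 1D LTP descriptor with an adaptive threshold could turn a single-axis gait cycle into a histogram feature. The function name ltp_1d, the 0.1 * standard-deviation threshold, and the radius parameter are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def ltp_1d(signal, radius=1, threshold=None):
    """Sketch of a 1D Local Ternary Pattern feature for a gait-cycle segment.

    Each sample is compared with its +/- radius neighbours: differences above
    +threshold map to +1, below -threshold to -1, otherwise 0. As in standard
    LTP, the ternary pattern is split into an upper and a lower binary code,
    and histograms of the codes form the feature vector.
    """
    x = np.asarray(signal, dtype=float)
    if threshold is None:
        # Assumed adaptive threshold: a fraction of the segment's standard
        # deviation (the paper's actual adaptive rule is not specified here).
        threshold = 0.1 * np.std(x)

    upper_codes, lower_codes = [], []
    for i in range(radius, len(x) - radius):
        centre = x[i]
        neighbours = np.concatenate([x[i - radius:i], x[i + 1:i + radius + 1]])
        diff = neighbours - centre
        ternary = np.where(diff > threshold, 1, np.where(diff < -threshold, -1, 0))
        # Split the ternary pattern into two binary patterns and encode each
        # as an integer using positional weights.
        weights = 2 ** np.arange(len(ternary))
        upper_codes.append(int(np.dot((ternary == 1).astype(int), weights)))
        lower_codes.append(int(np.dot((ternary == -1).astype(int), weights)))

    # Normalised histograms of the upper and lower codes.
    n_bins = 2 ** (2 * radius)
    feat = np.concatenate([
        np.bincount(upper_codes, minlength=n_bins),
        np.bincount(lower_codes, minlength=n_bins),
    ]).astype(float)
    return feat / (feat.sum() + 1e-12)

# Usage with a synthetic single-axis gait cycle of 128 samples.
cycle = np.sin(np.linspace(0, 2 * np.pi, 128)) + 0.05 * np.random.randn(128)
features = ltp_1d(cycle, radius=2)
print(features.shape)  # (32,) for radius=2
```

In this sketch the histogram representation discards the position of each code within the cycle, which is one common way such descriptors gain tolerance to temporal misalignment between sessions; how A1TIL actually achieves time invariance is described in the full paper.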
