Abstract

Human gait has been regarded as a useful behavioral biometric trait for personal identification and authentication. This study proposes an effective neural-network-based approach for classifying gait measured with wearable inertial sensors. Three-axis accelerometer and three-axis gyroscope data were acquired from the posterior pelvis, both thighs, both shanks, and both feet while 29 semi-professional athletes, 19 participants with normal feet, and 21 patients with foot deformities walked along a 20-meter straight path. A classifier based on gait parameters and a fully connected neural network was developed by applying 4-fold cross-validation to 80% of the total samples. On the test set consisting of the remaining 20% of samples, this classifier achieved an accuracy of 93.02% in categorizing the athlete, normal-foot, and deformed-foot groups. Using the same model validation and evaluation procedure, the convolutional neural network-based classifier achieved up to 98.19% accuracy. This classifier was trained on gait spectrograms obtained from time-frequency analysis of the raw acceleration and angular velocity data. Classification based only on the pelvic spectrograms reached an accuracy of 94.25% without requiring a time-consuming and resource-intensive feature engineering process. The notable performance and practicality in gait classification achieved in this study suggest the potential applicability of the proposed approaches in the field of biometrics.
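The abstract does not give implementation details, but the first classifier's workflow (an 80/20 train/test split, 4-fold cross-validation on the training portion, and a fully connected network over engineered gait parameters) could look roughly like the following sketch. The feature count, network size, and scikit-learn tooling are assumptions for illustration, not the authors' setup.

```python
# Hypothetical sketch of the gait-parameter classifier described in the
# abstract. The feature dimensionality, hidden-layer sizes, and use of
# scikit-learn are assumptions; only the 80/20 split, the 4-fold
# cross-validation, and the 3-class task come from the abstract.
import numpy as np
from sklearn.model_selection import train_test_split, StratifiedKFold, cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data: X holds per-sample gait parameters, y holds group
# labels (0 = athlete, 1 = normal foot, 2 = deformed foot).
rng = np.random.default_rng(0)
X = rng.normal(size=(345, 24))      # assumed number of samples and features
y = rng.integers(0, 3, size=345)

# Hold out 20% of the samples for the final test, as in the abstract.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=2000, random_state=0))

# 4-fold cross-validation on the 80% training split for model validation.
cv = StratifiedKFold(n_splits=4, shuffle=True, random_state=0)
cv_acc = cross_val_score(model, X_train, y_train, cv=cv, scoring="accuracy")
print("CV accuracy per fold:", cv_acc)

# Final evaluation on the held-out 20%.
model.fit(X_train, y_train)
print("Test accuracy:", model.score(X_test, y_test))
```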

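For the second classifier, the abstract states only that gait spectrograms were computed by time-frequency analysis of the raw 6-channel IMU signals and fed to a convolutional neural network. A minimal sketch of that pipeline, assuming a short-time Fourier transform via SciPy and a small PyTorch CNN (the sampling rate, window length, and network layout are all assumptions, not the paper's settings), might look like this:

```python
# Hypothetical sketch of the spectrogram-based pipeline: an STFT turns
# each 6-channel IMU recording (3-axis acceleration plus 3-axis angular
# velocity) into a stack of spectrograms, which a small CNN classifies
# into the three groups. All numeric settings here are assumptions.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import spectrogram

FS = 100  # assumed IMU sampling rate in Hz

def imu_to_spectrograms(signal):
    """signal: (6, n_samples) raw IMU channels -> (6, n_freqs, n_frames)."""
    specs = []
    for channel in signal:
        _, _, Sxx = spectrogram(channel, fs=FS, nperseg=64, noverlap=48)
        specs.append(np.log1p(Sxx))    # log scaling compresses dynamic range
    return np.stack(specs)

class GaitCNN(nn.Module):
    """Small CNN over 6-channel spectrogram 'images'; 3 output groups."""
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(6, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Example: one simulated 10-second recording from a single sensor site
# (e.g. the pelvis, which alone gave 94.25% accuracy in the abstract).
raw = np.random.randn(6, 10 * FS)
spec = torch.tensor(imu_to_spectrograms(raw), dtype=torch.float32)
logits = GaitCNN()(spec.unsqueeze(0))   # add batch dimension
print(logits.shape)                      # torch.Size([1, 3])
```

Because the spectrograms are computed directly from the raw signals, this route skips the hand-crafted gait-parameter extraction entirely, which is the practicality advantage the abstract highlights.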