Abstract
Inertial Measurement Unit (IMU)-based gait analysis is a promising and attractive approach for user recognition. Recently, the adoption of deep learning techniques has yielded significant performance improvements. However, most existing studies have focused on exploiting the spatial information of gait data (using Convolutional Neural Networks (CNNs)), while the temporal part has received little attention. In this study, we propose a new multi-model Long Short-Term Memory (LSTM) network for learning temporal gait features. First, we observe that an LSTM is able to capture patterns hidden inside gait data sequences that are out of synchronization. Thus, instead of using gait cycle-based segments, our model accepts gait cycle-free segments (i.e., fixed-length windows) as input. In this way, the classification task does not depend on gait cycle detection, which usually suffers from noise and bias. Second, we propose a new LSTM network architecture in which one LSTM is used for each gait data channel and a group of consecutive signals is processed at each step. This strategy allows the network to handle long input sequences effectively and achieve improved performance compared to existing LSTM-based gait models. In addition, beyond using the LSTM alone, we combine it with a CNN model to construct a hybrid network, which further improves recognition performance. We evaluated our LSTM and hybrid networks under different settings using the whuGAIT and OU-ISIR datasets. The experiments showed that our LSTM network outperformed existing LSTM networks, and its combination with a CNN established new state-of-the-art performance on both the verification and identification tasks.
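To make the per-channel, grouped-step idea concrete, the following is a minimal PyTorch sketch, not the authors' released implementation: one LSTM per IMU data channel, with each LSTM step consuming a group of consecutive samples from a fixed-length, cycle-free window. All sizes (6 channels, 128-sample windows, groups of 8, 64 hidden units, 118 subjects) are illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch of a multi-model LSTM over per-channel grouped windows.
# Assumed hyperparameters only; not the paper's exact configuration.
import torch
import torch.nn as nn

class PerChannelGroupedLSTM(nn.Module):
    def __init__(self, num_channels=6, window_len=128, group_size=8,
                 hidden_size=64, num_classes=118):
        super().__init__()
        assert window_len % group_size == 0
        self.group_size = group_size
        # One independent LSTM per gait data channel (e.g., 3-axis accel + 3-axis gyro).
        self.lstms = nn.ModuleList([
            nn.LSTM(input_size=group_size, hidden_size=hidden_size, batch_first=True)
            for _ in range(num_channels)
        ])
        self.classifier = nn.Linear(num_channels * hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, num_channels, window_len) -- a fixed-length, gait cycle-free segment.
        batch, num_channels, _ = x.shape
        feats = []
        for c, lstm in enumerate(self.lstms):
            # Reshape the channel's signal into groups of consecutive samples,
            # so each LSTM step sees `group_size` values at once (shorter sequence).
            seq = x[:, c, :].reshape(batch, -1, self.group_size)
            _, (h_n, _) = lstm(seq)
            feats.append(h_n[-1])          # last hidden state as the channel feature
        # Concatenate per-channel features and classify the subject identity.
        return self.classifier(torch.cat(feats, dim=1))

# Example: a batch of 4 fixed-length windows.
model = PerChannelGroupedLSTM()
logits = model(torch.randn(4, 6, 128))     # -> (4, 118) class scores
```

Grouping consecutive samples shortens the effective sequence length each LSTM must unroll over, which is one plausible reason such a design handles long fixed-length windows more effectively than a single sample-by-sample LSTM.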