Abstract
Gait phase contains rich kinematic information about the lower limbs and is of great reference value for rehabilitation medicine, assistive device design, identity recognition, and related fields. This research presents a wearable gait phase segmentation method based on lower limb motion capture. In our method, a body sensor network covering the whole lower limbs was established to capture human gait motion data, and a 3-D lower limb dynamic model was created to reconstruct lower limb movements through multi-sensor data fusion. Six gait events were labeled using the lower limb dynamic model. A deep classification network combining a temporal convolutional network (TCN) and long short-term memory (LSTM) is then proposed to segment the six gait phases as a pattern classification problem. In addition, different sensor combinations for gait phase segmentation were evaluated to select an optimal sensor layout. Detection performance is measured using accuracy, specificity, recall, and F1 score, with averaged values of 98.9%, 98.9%, 98.8%, and 98.9%, respectively. The overall experimental results demonstrate that the proposed method addresses gait phase segmentation well and provides spatial-temporal parameters for further gait analysis.
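The abstract describes a hybrid TCN-LSTM network that classifies each time step of multi-channel sensor data into one of six gait phases. As a rough illustration only, the sketch below shows one plausible shape of such an architecture in PyTorch; the channel count, hidden size, kernel sizes, and layer depth are all assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn

class TCNLSTM(nn.Module):
    """Hypothetical TCN + LSTM gait-phase classifier sketch.

    Assumed input: (batch, n_channels, time) sensor sequences from a
    lower-limb body sensor network. Output: per-time-step logits over
    six gait phases. All hyperparameters here are illustrative.
    """

    def __init__(self, n_channels=12, n_phases=6, hidden=64):
        super().__init__()
        # Small dilated 1-D convolution stack standing in for the TCN;
        # padding is chosen so the sequence length is preserved.
        self.tcn = nn.Sequential(
            nn.Conv1d(n_channels, hidden, kernel_size=3, padding=1, dilation=1),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, padding=2, dilation=2),
            nn.ReLU(),
        )
        # LSTM models longer-range temporal dependencies over TCN features.
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_phases)

    def forward(self, x):            # x: (batch, channels, time)
        h = self.tcn(x)              # (batch, hidden, time)
        h = h.transpose(1, 2)        # (batch, time, hidden) for the LSTM
        h, _ = self.lstm(h)          # (batch, time, hidden)
        return self.head(h)          # (batch, time, n_phases) logits

model = TCNLSTM()
x = torch.randn(2, 12, 50)           # 2 sequences, 12 channels, 50 steps
logits = model(x)
phases = logits.argmax(dim=-1)       # predicted phase index per time step
```

Taking the argmax over the final dimension yields a phase label (0-5) for every time step, which is the per-sample segmentation the abstract evaluates with accuracy, specificity, recall, and F1 score.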
Published in: IEEE Transactions on Instrumentation and Measurement