Abstract

Smooth and natural interaction between humans and lower-limb exoskeletons is important; however, obtaining joint rotation angles accurately and in real time remains a challenge. In this paper, we propose stacked convolutional and long short-term memory networks (Conv-LSTM) to estimate hip, knee, and ankle joint angles from sEMG signals across locomotion modes including walking, running, stair descent, stair ascent, stand-to-sit, sit-to-stand, and jumping. The joint angles are calculated from kinematic models using Euler angle signals measured by IMUs. The sEMG signals and joint angles are segmented according to gait cycles identified from footswitch signals. Time-frequency analysis of the sEMG signals is carried out using the continuous wavelet transform. The Conv-LSTM model extracts spatiotemporal information from its input to establish a mapping from sEMG sequences to multi-joint angle sequences. Estimation performance is evaluated using the coefficient of determination (R2), root mean squared error (RMSE), and dynamic time warping (DTW). The time-domain (TD) features of sEMG yield better joint angle estimation than the frequency-domain and time-frequency-domain features (p<0.05). The Conv-LSTM model with TD features as input outperforms a BP network and state-of-the-art machine learning algorithms (kernel ridge regression, random forest, and support vector regression) on multi-joint angle estimation (R2: 0.9334, 0.9110, 0.9236, 0.9238, 0.8999, 0.9430, 0.9351; p<0.05). The estimation results are simulated in V-REP for exoskeleton control.
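The abstract does not enumerate which time-domain (TD) features are extracted from each sEMG window. As a minimal sketch, assuming a widely used feature set (mean absolute value, root mean square, waveform length, and thresholded zero crossings), per-window extraction might look like:

```python
import numpy as np

def td_features(window, zc_threshold=0.01):
    """Compute four common sEMG time-domain features for one window.

    Note: this feature set is an illustrative assumption, not the
    paper's confirmed configuration.
    """
    window = np.asarray(window, dtype=float)
    mav = np.mean(np.abs(window))            # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))      # root mean square
    wl = np.sum(np.abs(np.diff(window)))     # waveform length
    # zero crossings, gated by an amplitude threshold to reject noise
    signs = np.sign(window)
    crossings = (signs[:-1] * signs[1:] < 0) & (
        np.abs(np.diff(window)) > zc_threshold
    )
    zc = int(np.sum(crossings))
    return np.array([mav, rms, wl, zc])

# Example: one 200-sample window of a synthetic sEMG channel
rng = np.random.default_rng(0)
emg = rng.standard_normal(1000) * 0.1
feats = td_features(emg[:200])
print(feats.shape)  # (4,)
```

In practice such features would be computed per muscle channel over sliding windows within each gait cycle, then stacked into the sequence fed to the Conv-LSTM.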
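The three evaluation metrics named in the abstract have standard definitions; a sketch using the textbook formulations (the paper's DTW normalization may differ) is:

```python
import numpy as np

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def dtw_distance(a, b):
    """Classic O(n*m) dynamic-programming DTW with absolute-difference cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Sanity check on a synthetic joint-angle trajectory: a perfect
# prediction gives R2 = 1, RMSE = 0, and DTW distance = 0.
t = np.linspace(0, 2 * np.pi, 100)
angle = 30.0 * np.sin(t)  # degrees, hypothetical hip trajectory
print(r2(angle, angle), rmse(angle, angle), dtw_distance(angle, angle))
```

Unlike R2 and RMSE, DTW tolerates temporal misalignment between the estimated and measured angle curves, which is useful when the model's output lags the true joint motion within a gait cycle.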
