Abstract

Human joint moments play an important role in quantitative rehabilitation assessment and exoskeleton robot control. However, existing moment prediction methods require kinematic and kinetic data of the human body as input, and measuring these data requires special equipment, which prevents their use in unconstrained environments. To address this, this paper develops a novel method in which a small number of input variables selected by Elastic Net are used as the input of an artificial neural network (ANN) to predict joint moments, making prediction in daily life possible. The method is tested on experimental data collected from eight healthy subjects running on a treadmill at speeds of 2, 3, 4, and 5 m/s. Taking the right lower limb's 10 electromyography (EMG) signals and 5 joint angles as the candidate variable set, Elastic Net is used to obtain the variable coefficients for the right lower limb's four joint moments. The ANN inputs determined by these variable coefficients are then used to train the network and predict the joint moments. Prediction accuracy is evaluated using the normalized root-mean-square error (NRMSE%) and the cross-correlation coefficient ($\rho$) between the predicted joint moments and the moments obtained from multi-body dynamics. The results suggest that the method can accurately predict joint moments in terms of both NRMSE and cross-correlation ($\rho > 0.9633$) with only 5–6 EMG signals. In conclusion, this method effectively reduces the number of input variables while maintaining accuracy, which makes joint moment prediction simple and free from equipment limitations. It may facilitate research on real-time gait analysis and exoskeleton robot control in motor rehabilitation.
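The pipeline described in the abstract (Elastic Net variable selection, then an ANN regressor, evaluated with NRMSE and $\rho$) can be illustrated with a minimal Python sketch. This is not the authors' code: the synthetic data, the scikit-learn ElasticNetCV/MLPRegressor choices, the coefficient threshold, and the range-based NRMSE normalization are all assumptions made for illustration.

```python
"""Illustrative sketch (not the authors' implementation): Elastic Net-based
input selection followed by an ANN for joint moment prediction."""
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Hypothetical candidate inputs: 10 EMG channels + 5 joint angles per sample.
rng = np.random.default_rng(0)
n_samples = 2000
X = rng.standard_normal((n_samples, 15))   # columns 0-9: EMG, 10-14: joint angles
# Synthetic target standing in for one joint moment (e.g., knee flexion moment).
y = 0.8 * X[:, 0] - 0.5 * X[:, 3] + 0.3 * X[:, 10] + 0.1 * rng.standard_normal(n_samples)

X = StandardScaler().fit_transform(X)

# Step 1: Elastic Net assigns a coefficient to each candidate variable;
# variables with (near-)zero coefficients are dropped (threshold is an assumption).
enet = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5).fit(X, y)
selected = np.flatnonzero(np.abs(enet.coef_) > 1e-3)
print("selected input indices:", selected)

# Step 2: train an ANN (multilayer perceptron) on the reduced input set.
ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
ann.fit(X[:, selected], y)
y_pred = ann.predict(X[:, selected])

# Step 3: evaluate with NRMSE (%) and the cross-correlation coefficient rho.
# NRMSE is normalized here by the target range; the paper's exact normalization may differ.
nrmse = 100 * np.sqrt(np.mean((y - y_pred) ** 2)) / (y.max() - y.min())
rho = np.corrcoef(y, y_pred)[0, 1]
print(f"NRMSE = {nrmse:.2f}%, rho = {rho:.4f}")
```

In practice, one such model would be fitted per joint moment (the paper considers four right lower-limb joint moments), with the multi-body dynamics moments serving as the training targets in place of the synthetic target used above.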
