Abstract
Background
Machine learning models have been successfully implemented for estimating gait events from surface electromyographic (sEMG) signals during walking. Most of them rely on inter-subject approaches to data preparation. The aim of this study is to propose an intra-subject approach for binary classification of gait phases and prediction of gait events, based on neural network interpretation of sEMG signals, and to test the hypothesis that the intra-subject approach achieves better performance than an inter-subject one. To this aim, sEMG signals were acquired from 10 leg muscles over about 10,000 strides from 23 healthy adults during ground walking, and a multi-layer perceptron (MLP) architecture was implemented.
Results
Classification/prediction accuracy was tested against the ground truth, represented by the foot–floor-contact signal provided by three foot-switches, on samples not used during the training phase. The approach achieved an average classification accuracy of 96.1 ± 1.9% and mean absolute errors (MAE) of 14.4 ± 4.7 ms and 23.7 ± 11.3 ms in predicting heel-strike (HS) and toe-off (TO) timing, respectively. Performance was compared directly with that of the inter-subject approach in the same population: mean classification accuracy improved by 1.4%, and MAE in predicting HS and TO timing decreased significantly (p < 0.05), by 23% and 33%, respectively.
Conclusions
The study developed an accurate methodology for the classification and prediction of gait events, based on neural network interpretation of intra-subject sEMG data, that outperforms more typical inter-subject approaches. Its clinically useful contribution is the prediction of gait events from the EMG signals of a single subject alone, helping to remove the need for further sensors for the direct measurement of temporal data.
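The evaluation described above can be illustrated with a minimal numpy sketch: gait events are extracted as transitions of a binary stance/swing signal (HS = swing-to-stance, TO = stance-to-swing), and prediction quality is scored by sample-wise accuracy and the MAE between matched event timings. The signals, sampling rate, and matching-by-order strategy below are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def detect_events(stance, fs=1000.0):
    """Return heel-strike and toe-off times (ms) from a binary
    stance signal (1 = stance, 0 = swing) sampled at fs Hz."""
    d = np.diff(stance.astype(int))
    hs = np.where(d == 1)[0] + 1   # swing -> stance transitions
    to = np.where(d == -1)[0] + 1  # stance -> swing transitions
    return hs * 1000.0 / fs, to * 1000.0 / fs

def event_mae(true_ev, pred_ev):
    """Mean absolute error (ms) between event timings matched in order."""
    n = min(len(true_ev), len(pred_ev))
    return float(np.mean(np.abs(true_ev[:n] - pred_ev[:n])))

# Toy example: a ground-truth stance signal vs. a prediction whose
# transitions are shifted by a few samples (values are illustrative).
truth = np.r_[np.zeros(100), np.ones(600), np.zeros(300)]
pred  = np.r_[np.zeros(110), np.ones(600), np.zeros(290)]

acc = float(np.mean(truth == pred))   # sample-wise classification accuracy
hs_t, to_t = detect_events(truth)
hs_p, to_p = detect_events(pred)
print(acc, event_mae(hs_t, hs_p), event_mae(to_t, to_p))
```

With both transitions delayed by 10 samples at 1 kHz, the sketch reports 0.98 accuracy and 10 ms MAE for HS and TO, showing how a small timing offset maps to both metrics at once.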
Highlights
Machine learning models were satisfactorily implemented for estimating gait events from surface electromyographic signals during walking
The approach was applied to 10 surface electromyographic (sEMG) signals acquired from 23 healthy adults
The present study proposes an accurate methodology for classifying stance vs. swing and predicting the timing of gait events, based on neural network interpretation of intra-subject sEMG data during walking
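The stance-vs.-swing classification named in the highlights can be sketched as a small multi-layer perceptron trained with gradient descent. The numpy-only example below is a minimal illustration: the synthetic features, labeling rule, layer sizes, and hyperparameters are assumptions for demonstration, not the authors' architecture or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each sample is a feature vector of envelope values
# from 10 muscles; the label is stance (1) vs. swing (0). Real sEMG
# features would come from filtering and rectification of the raw signal.
X = rng.normal(size=(400, 10))
y = (X[:, :3].sum(axis=1) > 0).astype(float)  # synthetic rule, illustration only

# One hidden layer with a sigmoid output: a minimal MLP in the spirit of
# the paper's approach (sizes here are assumptions, not the authors').
W1 = rng.normal(scale=0.5, size=(10, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1));  b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(2000):                      # full-batch gradient descent
    h = np.tanh(X @ W1 + b1)               # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()       # predicted stance probability
    g = (p - y)[:, None] / len(y)          # d(cross-entropy)/d(logit)
    gh = (g @ W2.T) * (1 - h**2)           # backprop through tanh
    W2 -= lr * (h.T @ g); b2 -= lr * g.sum(axis=0)
    W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(axis=0)

h = np.tanh(X @ W1 + b1)                   # re-evaluate with final weights
p = sigmoid(h @ W2 + b2).ravel()
acc = float(np.mean((p > 0.5) == y))
print(f"training accuracy: {acc:.2f}")
```

In the intra-subject approach, such a network would be trained and tested on strides from the same subject (with held-out strides for testing), whereas the inter-subject approach trains on other subjects' data.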
Summary
Machine learning models have been successfully implemented for estimating gait events from surface electromyographic (sEMG) signals during walking (Di Nardo et al., BioMed Eng OnLine (2020) 19:58). Recent innovations include wearable sensors (such as inertial measurement units, IMUs) and robust artificial-intelligence methods for handling large amounts of data and signals (machine and deep learning). This kind of innovation is starting to allow a reduction in the complexity of experimental protocols for gait analysis and a cheaper, less invasive, and more comfortable assessment of gait data. It has been reported that an EMG-based approach seems preferable to other approaches (including IMUs) in specific settings such as the control of exoskeleton devices [2], where EMG signals can be used to identify segment motion in advance, limiting delays in the control action. An example is reported by Wentink et al. [3], who showed that the analysis of EMG signals allows gait initiation to be assessed earlier (63–138 ms) than with inertial sensors, in a population of transfemoral amputees. The possibility of assessing HS and TO timing directly from the sEMG signal, without additional sensors such as foot-switches and IMUs, appears very useful in those cases.