Abstract

The role of feature extraction in electromyogram (EMG)-based pattern recognition has recently been emphasized, with several publications promoting deep learning (DL) solutions that outperform traditional methods. It has been shown that the ability of DL models to extract temporal, spatial, and spatio–temporal information significantly enhances the performance and generalizability of myoelectric control. Despite these advancements, DL models are computationally expensive, requiring long training times, large amounts of training data, and substantial computational resources, yielding solutions that may not yet be feasible for clinical translation given the available technology. The aim of this paper is, therefore, to leverage the benefits of spatio–temporal DL concepts in a computationally feasible and accurate traditional feature extraction method. Specifically, the proposed novel method extracts a set of well-known time-domain features into a matrix representation, convolves them with predetermined fixed filters, and temporally evolves the resulting features on short- and long-term bases to capture the EMG temporal dynamics. The proposed method, based on Fixed Spatio–Temporal Convolutions, offers significant reductions in computational cost while competing with, and even outperforming, recent DL models. Experimental tests were performed on sparse and high-density EMG (HD-EMG) signal databases, covering a total of 44 subjects performing up to 53 movements. Despite its simplicity compared to deep approaches, our results show that the proposed solution significantly reduces classification error rates by 3% to 10% relative to recent DL models, while remaining efficient for real-time implementation.
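The pipeline the abstract outlines — a matrix of classic time-domain features, convolution with predetermined (not learned) filters, and short-/long-term temporal evolution — can be sketched as follows. This is a minimal illustrative reconstruction, not the paper's implementation: the feature set (MAV, WL, ZC, SSC), the 3×3 smoothing kernel, the window/step sizes, and the exponential-smoothing constants are all assumptions chosen for clarity.

```python
import numpy as np

def td_features(window):
    """Classic time-domain features per channel: MAV, WL, ZC, SSC.
    (Illustrative choice; the paper's exact feature set may differ.)"""
    mav = np.mean(np.abs(window), axis=0)                       # mean absolute value
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)        # waveform length
    sign = np.signbit(window).astype(int)
    zc = np.sum(np.abs(np.diff(sign, axis=0)), axis=0)          # zero crossings
    dsign = np.signbit(np.diff(window, axis=0)).astype(int)
    ssc = np.sum(np.abs(np.diff(dsign, axis=0)), axis=0)        # slope-sign changes
    return np.stack([mav, wl, zc, ssc])                         # (features, channels)

def fixed_spatiotemporal_features(emg, win=200, step=50,
                                  alpha_short=0.5, alpha_long=0.05):
    """Sketch of the fixed spatio-temporal convolution idea:
    1) build a feature matrix per analysis window,
    2) convolve it with a predetermined fixed filter,
    3) evolve the result on short- and long-term bases
       via exponential smoothing (hypothetical smoothing constants)."""
    # Predetermined fixed 3x3 smoothing kernel (an assumption, not the paper's filters)
    kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float)
    kernel /= kernel.sum()
    short, long_, out = None, None, []
    for start in range(0, emg.shape[0] - win + 1, step):
        fmat = td_features(emg[start:start + win])              # (features, channels)
        # "Same"-size 2D convolution with the fixed kernel, edge-padded
        padded = np.pad(fmat, 1, mode="edge")
        conv = np.zeros_like(fmat)
        for i in range(fmat.shape[0]):
            for j in range(fmat.shape[1]):
                conv[i, j] = np.sum(padded[i:i + 3, j:j + 3] * kernel)
        # Short- and long-term temporal dynamics
        short = conv if short is None else alpha_short * conv + (1 - alpha_short) * short
        long_ = conv if long_ is None else alpha_long * conv + (1 - alpha_long) * long_
        out.append(np.concatenate([short.ravel(), long_.ravel()]))
    return np.asarray(out)  # (n_windows, 2 * n_features * n_channels)
```

Because the filters are fixed rather than trained, the whole pipeline costs one pass of cheap array arithmetic per window, which is the source of the computational savings the abstract claims over DL feature extractors.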
