Abstract

The spectrotemporal information content of surface electromyography (sEMG) has shown strong potential for predicting the intended motor command. Over the last decade, the accelerated adoption of powerful deep-learning techniques, alongside advances in active prostheses and neurorobots, has drawn a great deal of interest to intelligent myoelectric prostheses, with the ultimate goal of accurately predicting upper-limb gestures. Recent research involves deep CNNs, RNNs, and hybrid frameworks, which have shown promising results. However, deep-learning models are almost always challenged by structural complexity, large numbers of trainable parameters, overfitting concerns, and prolonged training times, which complicate their practicality and limit their outcomes. In this letter, for the first time, we propose temporal dilation in the LSTM module of a hybrid deep-network model for sEMG-based gesture detection, hypothesizing improved accuracy and faster training. We also analyze the effect of dilation aggressiveness. We conduct a systematic and statistical analysis of the efficacy of the proposed approach in comparison to the recent literature, including our previous work. This letter shows that the proposed temporally-dilated LSTM model outperforms recent deep-learning techniques in terms of accuracy and, more significantly, reduces the training time while increasing the convergence speed, with the ultimate goal of maximizing practicality and translational value for neurorobotic systems.
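The full text is not available here, so the following is only a generic sketch of the temporal-dilation idea the abstract names: an LSTM cell whose recurrent connection at step t reads the hidden and cell states from step t - d (dilation d) instead of t - 1, shortening the backpropagation path through long sequences. All names, sizes, and initialization choices below are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class DilatedLSTMCell:
    """Minimal single-layer LSTM whose recurrence skips back `dilation`
    steps: at time t it reads h[t - d] and c[t - d] rather than h[t - 1].
    With dilation=1 this reduces to a standard LSTM recurrence."""

    def __init__(self, input_size, hidden_size, dilation, seed=0):
        rng = np.random.default_rng(seed)
        s = hidden_size
        # One stacked weight matrix for the four gates (input, forget, cell, output).
        self.W = rng.normal(0.0, 0.1, (4 * s, input_size + s))
        self.b = np.zeros(4 * s)
        self.hidden_size = s
        self.dilation = dilation

    def forward(self, x_seq):
        """x_seq: (T, input_size) array; returns (T, hidden_size) hidden states."""
        T = x_seq.shape[0]
        s, d = self.hidden_size, self.dilation
        # Pad by d so that index t in the padded arrays holds the state
        # from time t - d (zeros for t - d < 0).
        h = np.zeros((T + d, s))
        c = np.zeros((T + d, s))
        for t in range(T):
            # Recurrent input is the state d steps back: padded h[t] == h_{t-d}.
            z = self.W @ np.concatenate([x_seq[t], h[t]]) + self.b
            i, f, g, o = np.split(z, 4)
            c[t + d] = sigmoid(f) * c[t] + sigmoid(i) * np.tanh(g)
            h[t + d] = sigmoid(o) * np.tanh(c[t + d])
        return h[d:]  # hidden states aligned with the input time steps
```

Because each state depends on the state d steps earlier, gradients traverse roughly T/d recurrent hops instead of T, which is one plausible reading of the abstract's claim of faster training and convergence; stacking cells with increasing dilations is a common way to retain short-range context.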
