Abstract

Robust neural decoding of intended motor output is crucial for intuitive control of assistive devices, such as robotic hands, during daily tasks. Few existing neural decoders can predict kinetic and kinematic variables simultaneously. In the current study, we developed a continuous neural decoding approach that can concurrently predict fingertip forces and joint angles of multiple fingers. We obtained motoneuron firing activities by decomposing high-density electromyogram (HD EMG) signals of the extrinsic finger muscles. The identified motoneurons were first grouped and then refined for each finger (index or middle) and task (finger force or dynamic movement) combination. The refined motoneuron groups (separation matrices) were then applied directly in real time to new EMG data from both fingers performing finger force and dynamic movement tasks. EMG-amplitude-based prediction was also performed as a comparison. We found that the newly developed decoding approach outperformed the EMG-amplitude method for both finger force and joint angle estimation, with a lower prediction error (Force: 3.47±0.43 vs 6.64±0.69% MVC, Joint Angle: 5.40±0.50° vs 12.8±0.65°) and a higher correlation (Force: 0.75±0.02 vs 0.66±0.05, Joint Angle: 0.94±0.01 vs 0.5±0.05) between the estimated and recorded motor output. The performance was also consistent across the two fingers. The developed neural decoding algorithm allowed us to accurately and concurrently predict finger forces and joint angles of multiple fingers in real time. Our approach can enable intuitive interaction with assistive robotic hands and support dexterous hand skills involving both force control and dynamic movement tasks.
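To make the decoding pipeline concrete, the following is a minimal sketch (using NumPy and synthetic data) of how a previously learned separation matrix could be applied to a new window of HD EMG to obtain motor-unit source signals, which are then mapped to a motor output. The array shapes, the smoothing window, and the ridge-regression mapping are illustrative assumptions; the sketch does not reproduce the paper's per-finger, per-task motoneuron grouping and refinement.

```python
# Illustrative sketch only, not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

n_channels, n_samples, n_units = 64, 2000, 10   # assumed HD EMG grid size, window length, unit count
emg = rng.standard_normal((n_channels, n_samples))   # new HD EMG window (synthetic stand-in)
W = rng.standard_normal((n_units, n_channels))       # separation matrix learned from offline decomposition

# Project the new EMG through the separation matrix to get motor-unit source signals.
sources = W @ emg                                    # shape: (n_units, n_samples)

# Convert sources to smoothed, firing-rate-like features (simple rectify + moving average).
window = 200                                         # ~100 ms at 2 kHz sampling (assumed)
kernel = np.ones(window) / window
rates = np.array([np.convolve(np.abs(s), kernel, mode="same") for s in sources])

# Map neural features to a motor output (e.g., fingertip force) with ridge regression,
# a placeholder for whatever regression stage the actual decoder uses.
target = rng.standard_normal(n_samples)              # synthetic force trace
X = rates.T                                          # (n_samples, n_units)
lam = 1e-2
beta = np.linalg.solve(X.T @ X + lam * np.eye(n_units), X.T @ target)
predicted = X @ beta
print("prediction shape:", predicted.shape)
```

In practice, separate separation matrices and output mappings would be maintained for each finger and task combination, and the same projected features could feed parallel regressors for force and joint angle.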
