Abstract

The ability of a novel biorealistic hand prosthesis to control grasp force reflects improved neural compatibility in human-prosthesis interaction. The primary purpose of this study was to validate a virtual training platform for amputee subjects and to evaluate the respective roles of visual and tactile information in fundamental force control tasks. We developed a digital twin of a tendon-driven prosthetic hand in the MuJoCo environment. Biorealistic controllers emulated a pair of antagonistic muscles controlling the index finger of the virtual hand, driven by surface electromyographic (sEMG) signals from amputees' residual forearm muscles. Grasp force information was transmitted to amputees through evoked tactile sensation (ETS) feedback. Six forearm amputees participated in force tracking and holding tasks under different feedback conditions or using their intact hands. Test results showed that visual feedback played a more predominant role than ETS feedback in the force tracking and holding tasks. However, in the absence of visual feedback during the force holding task, ETS feedback significantly enhanced motor performance compared with feedforward control alone. Thus, ETS feedback still supplied reliable sensory information that facilitated amputees' ability to control grasp force stably. The effects of tactile and visual feedback on force control were subject-specific when both types of feedback were provided simultaneously. Amputees were able to integrate visual and tactile information into the biorealistic controllers and achieve good sensorimotor performance in grasp force regulation. The virtual platform may provide a training paradigm for amputees to adapt optimally to the biorealistic hand controller and ETS feedback.
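To make the control architecture described above concrete, the sketch below shows one way an sEMG-driven antagonistic actuation loop for a single virtual finger joint could be set up with the MuJoCo Python bindings. This is a minimal illustrative example, not the authors' implementation: the finger model, actuator gains, and the synthetic sEMG envelopes are all assumptions, and the biorealistic muscle controllers and ETS feedback of the study are not reproduced here.

```python
# Minimal sketch of antagonistic, sEMG-driven actuation of one virtual finger
# joint in MuJoCo. Model names, gains, and sEMG envelopes are illustrative only.
import numpy as np
import mujoco

FINGER_XML = """
<mujoco>
  <worldbody>
    <body name="proximal_phalanx">
      <joint name="mcp_flexion" type="hinge" axis="0 1 0" range="0 1.57"/>
      <geom type="capsule" fromto="0 0 0 0.04 0 0" size="0.008"/>
    </body>
  </worldbody>
  <actuator>
    <!-- Two opposing actuators emulate a flexor/extensor muscle pair. -->
    <motor name="flexor"   joint="mcp_flexion" gear="1"  ctrlrange="0 1"/>
    <motor name="extensor" joint="mcp_flexion" gear="-1" ctrlrange="0 1"/>
  </actuator>
</mujoco>
"""

model = mujoco.MjModel.from_xml_string(FINGER_XML)
data = mujoco.MjData(model)

def semg_envelopes(t):
    """Placeholder for rectified, low-pass-filtered sEMG from the residual
    flexor/extensor muscles; here a synthetic ramp with constant co-contraction."""
    flexor = np.clip(t / 2.0, 0.0, 0.6)   # gradually increasing flexor drive
    extensor = 0.1                        # constant extensor baseline
    return flexor, extensor

for step in range(2000):
    t = step * model.opt.timestep
    data.ctrl[0], data.ctrl[1] = semg_envelopes(t)  # flexor, extensor commands
    mujoco.mj_step(model, data)

print("final MCP flexion angle (rad):", float(data.qpos[0]))
```

In a closed-loop version of such a setup, the simulated fingertip contact force would be read back each step and mapped to the ETS feedback channel, while the control inputs would come from the amputee's live sEMG rather than the synthetic envelopes used here.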
