Abstract

Dexterous control of upper-limb prostheses with multiarticulated wrists and hands remains a challenge due to the limitations of myoelectric man-machine interfaces. Multiple factors limit the overall performance and usability of these interfaces, such as the need to control degrees of freedom sequentially rather than concurrently and the inaccuracies in decoding user intent from weak or fatigued muscles. In this article, we developed a novel man-machine interface that endows a myoelectric prosthesis (MYO) with artificial perception, estimation of user intention, and intelligent control (MYO-PACE) to continuously support the user with automation while preparing the prosthesis for grasping. We compared the MYO-PACE against state-of-the-art myoelectric control (pattern recognition) in laboratory and clinical tests. For this purpose, eight able-bodied and two amputee participants performed a standard clinical test consisting of a series of manipulation tasks (a portion of the SHAP test), as well as a more complex sequence of transfer tasks in a cluttered scene. In all tests, the subjects not only completed the trials faster using the MYO-PACE but also achieved more efficient myoelectric control. These results demonstrate that implementing advanced perception, context interpretation, and autonomous decision-making in active prostheses improves control dexterity. Moreover, it effectively supports the user by speeding up the preshaping phase of the movement and decreasing muscle use.
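The baseline referred to above as "state-of-the-art myoelectric control (pattern recognition)" is, per the Highlights, a linear discriminant analysis (LDA) classifier. For orientation only, the following is a minimal sketch of what such a pattern-recognition pipeline typically looks like: windowed surface-EMG signals are reduced to time-domain features and classified into grasp classes by an LDA model. The feature set (Hudgins-style time-domain features), window parameters, channel count, and use of scikit-learn are illustrative assumptions, not the implementation evaluated in the paper.

```python
# Illustrative sketch only: a generic LDA-based myoelectric pattern-recognition
# baseline. Feature set, window length, channel count, and library choice are
# assumptions for illustration, not the study's actual setup.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def time_domain_features(window):
    """Per-channel time-domain features for one EMG window (samples x channels)."""
    mav = np.mean(np.abs(window), axis=0)                  # mean absolute value
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)   # waveform length
    zc = np.sum(window[:-1] * window[1:] < 0, axis=0)      # zero crossings
    d = np.diff(window, axis=0)
    ssc = np.sum(d[:-1] * d[1:] < 0, axis=0)               # slope sign changes
    return np.concatenate([mav, wl, zc, ssc])


def windowed_dataset(emg, labels, win=200, step=50):
    """Slide a window over continuous EMG; label each window by majority vote."""
    X, y = [], []
    for start in range(0, len(emg) - win, step):
        X.append(time_domain_features(emg[start:start + win]))
        y.append(np.bincount(labels[start:start + win]).argmax())
    return np.array(X), np.array(y)


# Hypothetical data: 8 EMG channels, 4 grasp classes (random placeholder values).
rng = np.random.default_rng(0)
emg = rng.standard_normal((10_000, 8))
grasp_labels = rng.integers(0, 4, size=10_000)

X, y = windowed_dataset(emg, grasp_labels)
clf = LinearDiscriminantAnalysis().fit(X, y)   # the "LDA" classifier of the baseline
print("Predicted grasp classes for first windows:", clf.predict(X[:5]))
```

In a deployed system of this kind, the class predicted for each incoming window would drive the grasp selection of the prosthesis at every control update.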

Highlights

  • When we grasp an object, we usually do not think about how we do it

  • All subjects successfully performed the tasks in both the MYO-PACE and the linear discriminant analysis (LDA) conditions

  • The total trial time was significantly longer for LDA [63.3 (23.4) s] than for MYO-PACE [50.5 (7.2) s], indicating better overall task performance with MYO-PACE


Summary

Introduction

When we grasp an object, we usually do not think about how we do it. The type of grasp, tuning of the wrist posture, and movement of the fingers seem to happen automatically. Years of training have led to highly dexterous brain/hand interaction that mainly happens subconsciously and includes the use of a variety of touch and proprioceptive sensors as well as stereovision through our eyes. Modern prosthetic hand models have as many as 24 actuated degrees of freedom, allowing individual finger movements [2], and an almost complete range of wrist movements [3]. In a relatively short time, the user must learn how to control these functions with interfaces that currently do not allow large information transfer rates and without feedback from touch or proprioception.


