Abstract

Automating wrist rotations in upper limb prostheses simplifies the human-machine interface, reducing the user's mental load and avoiding compensatory movements. This study explored the possibility of predicting wrist rotations in pick-and-place tasks from kinematic information of the other arm joints. To do this, the position and orientation of the hand, forearm, arm, and back were recorded from five subjects while they transported a cylindrical and a spherical object between four different locations on a vertical shelf. The rotation angles of the arm joints were extracted from the recordings and used to train feed-forward neural networks (FFNNs) and time-delay neural networks (TDNNs) to predict wrist rotations (flexion/extension, abduction/adduction, and pronation/supination) from the elbow and shoulder angles. Correlation coefficients between actual and predicted angles were 0.88 for the FFNN and 0.94 for the TDNN. These correlations improved when object information was added to the network or when the network was trained separately for each object (0.94 for the FFNN, 0.96 for the TDNN), and likewise when the network was trained specifically for each subject. These results suggest that compensatory movements in prosthetic hands could be reduced for specific tasks by using motorized wrists and automating their rotation based on kinematic information obtained from sensors appropriately positioned on the prosthesis and the user's body.
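The abstract does not specify the network architectures or preprocessing used, so the following is only a minimal illustrative sketch of the general approach it describes: predicting wrist rotation angles from elbow and shoulder angles with a feed-forward regressor, and approximating a time-delay network by stacking past samples into a tapped delay line. All array shapes, delay lengths, layer sizes, and data are hypothetical placeholders, not the authors' actual setup.

```python
# Illustrative sketch only; architectures, sizes, and data are assumptions,
# not the study's actual implementation.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.stats import pearsonr

def make_delay_features(X, n_delays):
    """Stack the current sample with n_delays past samples (tapped delay line),
    so a plain feed-forward regressor acts as a simple time-delay network."""
    rows = [X[i - n_delays:i + 1].ravel() for i in range(n_delays, len(X))]
    return np.asarray(rows)

# Hypothetical data: elbow/shoulder angles as inputs, wrist angles as targets
# (flexion/extension, abduction/adduction, pronation/supination), per time step.
T = 2000
X = np.random.randn(T, 4)   # e.g. elbow flexion plus three shoulder angles
Y = np.random.randn(T, 3)   # three wrist rotation angles

n_delays = 5                # assumed delay-line length
Xd = make_delay_features(X, n_delays)
Yd = Y[n_delays:]

# Feed-forward regressor trained on the windowed inputs (TDNN-style).
net = MLPRegressor(hidden_layer_sizes=(30,), max_iter=2000)
net.fit(Xd[:1500], Yd[:1500])
pred = net.predict(Xd[1500:])

# Evaluate with the metric the abstract reports: correlation between actual
# and predicted angles, averaged over the three wrist rotations.
r = np.mean([pearsonr(Yd[1500:, k], pred[:, k])[0] for k in range(3)])
print(f"mean correlation: {r:.2f}")
```

In the same spirit, object or subject identity could be appended to the input vector, or separate networks trained per object or per subject, which the abstract reports improved the correlations.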
