Abstract

Purpose: The accuracy and reliability of upper limb motion assessment have received great attention in the field of rehabilitation. The grasping test, which requires patients to grasp objects and move them to a target location, is widely used for motion assessment. Traditionally, upper limb motion ability is assessed by therapists, which relies mainly on experience and lacks quantitative indicators. This paper proposes a deep learning method, based on the vision system of our upper limb rehabilitation robot, that automatically recognizes the motion trajectory of rehabilitation target objects and quantitatively assesses upper limb motion in the grasping test.

Design/methodology/approach: First, an SRF network is designed to recognize the rehabilitation target objects grasped in the assessment tests. The upper limb motion trajectory is then calculated from the motion of the objects' central positions. Next, a GAE network is designed to analyze this trajectory, which reflects the motion of the upper limb. Finally, upper limb motion assessment tests are carried out on the upper limb rehabilitation exoskeleton platform to demonstrate the accuracy of both the object recognition of the SRF network and the motion assessment of the GAE network. The results, including object recognition, trajectory calculation and deviation assessment, are reported in detail.

Findings: The performance of the proposed networks is validated by experiments conducted on the upper limb rehabilitation robot, in which rehabilitation target objects are recognized, the motion trajectory is calculated and the upper limb motion performance is graded. The results show that the networks, covering both object recognition and trajectory evaluation, grade upper limb motion function accurately, with accuracy above 95.0% across different grasping tests.

Originality/value: A novel assessment method for upper limb motion is proposed and verified. The experimental results show that the accuracy can be remarkably enhanced and the stability of the results improved, providing more quantitative indicators for further application of upper limb motion assessment.
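The abstract does not give implementation details of the SRF or GAE networks, but the pipeline it describes (detect the grasped object in each camera frame, take its central position, and assess the resulting trajectory against the intended path) can be illustrated with a minimal sketch. The function names, the use of NumPy, and the mean-deviation metric below are illustrative assumptions rather than the paper's actual method; in the paper the trajectory is graded by the GAE network, not by a hand-written distance score.

```python
import numpy as np


def trajectory_from_centers(centers):
    """Stack per-frame (x, y) object centers into an (N, 2) trajectory array.

    In the paper's pipeline, these centers would come from the object-recognition
    network applied to each frame of a grasping test; here they are plain tuples.
    """
    return np.asarray(centers, dtype=float)


def mean_deviation(trajectory, reference):
    """Mean Euclidean deviation between an observed trajectory and a reference path.

    The reference path is linearly resampled so both curves have the same number
    of points before computing point-wise distances. This is a stand-in metric,
    not the GAE network's learned assessment.
    """
    idx = np.linspace(0, len(reference) - 1, num=len(trajectory))
    ref_x = np.interp(idx, np.arange(len(reference)), reference[:, 0])
    ref_y = np.interp(idx, np.arange(len(reference)), reference[:, 1])
    resampled = np.stack([ref_x, ref_y], axis=1)
    return float(np.mean(np.linalg.norm(trajectory - resampled, axis=1)))


if __name__ == "__main__":
    # Hypothetical example: an observed grasp-and-move trajectory versus a
    # straight-line target path from the start position to the target location.
    observed = trajectory_from_centers([(0.0, 0.0), (1.1, 0.9), (2.0, 2.1), (3.2, 2.9)])
    target = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
    print("mean deviation:", mean_deviation(observed, target))
```

A simple scalar deviation like this could serve as one of the quantitative indicators the paper argues for, while the learned GAE network would capture richer aspects of motion quality than a single distance measure.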
