Abstract

This paper presents an augmented reality application that assists people with limb amputations in controlling myoelectric prostheses. We use the low-cost Myo armband together with low-level signal-processing methods designed to control the filter levels and the processing chain. In particular, we apply deep learning techniques to process the signals and accurately identify seven different hand gestures. Building on this, we create an augmented reality projection of a hand, based on AprilTag markers, that displays the gesture identified by the deep learning models. To properly train the gesture recognition system, we built our own dataset with nine subjects and combined it with a publicly available one, yielding data from 24 subjects in total. Finally, three deep learning architectures were comparatively evaluated, all achieving high accuracy (95.56% for the best one). This validates our hypothesis that an adaptive platform can quickly learn personalized hand/arm gestures while projecting a virtual hand in real time, which can shorten the adaptation time to myoelectric prostheses and improve their acceptance.
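The abstract describes classifying seven hand gestures from the Myo armband's surface-EMG channels. As an illustrative sketch only (not the paper's architecture, which uses deep networks), the sliding-window segmentation and per-channel RMS feature extraction that typically precede EMG gesture classification can look like the following; the window size, step, and the toy nearest-centroid classifier are all assumptions made for the example:

```python
import numpy as np

WINDOW = 200   # samples per window (~1 s at the Myo's 200 Hz sampling rate)
STEP = 100     # 50% overlap between consecutive windows
CHANNELS = 8   # the Myo armband provides 8 sEMG channels

def rms_features(emg, window=WINDOW, step=STEP):
    """Segment an (n_samples, n_channels) sEMG recording into overlapping
    windows and compute the per-channel RMS, a common low-level EMG feature."""
    feats = []
    for start in range(0, len(emg) - window + 1, step):
        seg = emg[start:start + window]
        feats.append(np.sqrt(np.mean(seg ** 2, axis=0)))
    return np.array(feats)  # shape: (n_windows, n_channels)

class NearestCentroid:
    """Toy classifier standing in for the paper's deep learning models."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # Distance from each feature vector to each class centroid.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[np.argmin(d, axis=1)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic "gestures": each has a distinct per-channel amplitude profile,
    # mimicking how muscle activation patterns differ across gestures.
    profiles = rng.uniform(0.5, 2.0, size=(7, CHANNELS))
    X, y = [], []
    for gesture, p in enumerate(profiles):
        emg = rng.normal(0.0, p, size=(2000, CHANNELS))  # zero-mean sEMG-like signal
        f = rms_features(emg)
        X.append(f)
        y.extend([gesture] * len(f))
    X, y = np.vstack(X), np.array(y)
    clf = NearestCentroid().fit(X, y)
    print("training accuracy:", (clf.predict(X) == y).mean())
```

In a real pipeline, the RMS features (or raw windows) would feed the deep architectures the paper compares, and the predicted gesture would drive the AprilTag-anchored virtual hand.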
