Abstract

Hand gestures are emerging as a mainstream method of manipulating human-computer interfaces (HCIs). For people whose disabilities limit their mobility, hand gesture-based HCIs must be specifically designed. To achieve an effective hand gesture HCI, this study integrated a mobile service robot platform, three-dimensional (3D) imaging sensors, and a wearable Myo armband. Four kernel techniques are presented: (1) hand gesture recognition through the Myo armband software development kit, using a two-layer hierarchy scheme to substantially increase the number of available gesture commands; (2) user identity recognition using clustering-based support vector machine classifiers with a purpose-designed root mean square surface electromyography (RMS-sEMG) feature; (3) robot vehicle navigation with effective obstacle avoidance, using a conceptually simple and computationally fast approach; and (4) efficient vehicle positioning based on face-detection information from the 3D imaging sensor, enabling the robot to receive gesture commands from the user with disabilities.
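As one illustration of how a two-layer hierarchy can expand a small gesture vocabulary, the minimal Python sketch below assumes the scheme composes ordered pairs of the five gestures the Myo SDK recognises (fist, wave-in, wave-out, fingers-spread, double-tap), raising five base gestures to 25 composite commands. The pairing interpretation, the decode helper, and the command names are illustrative assumptions, not the authors' implementation.

    from itertools import product

    # The Myo SDK recognises five built-in hand gestures.
    GESTURES = ["fist", "wave_in", "wave_out", "fingers_spread", "double_tap"]

    # Assumed two-layer reading: the first gesture selects a command group,
    # the second selects a command within it, giving 5 x 5 = 25 commands.
    COMMANDS = {pair: f"cmd_{i}"
                for i, pair in enumerate(product(GESTURES, repeat=2))}

    def decode(first: str, second: str) -> str:
        """Map an ordered two-gesture sequence to a composite command
        (hypothetical helper)."""
        return COMMANDS[(first, second)]

    print(len(COMMANDS))              # 25
    print(decode("fist", "wave_in"))  # cmd_1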
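The RMS-sEMG feature itself is a standard computation: the root mean square of each sEMG channel over an analysis window. The sketch below, with an assumed window length, synthetic placeholder data, and scikit-learn's SVC standing in for the paper's clustering-based SVM classifiers, shows how such a feature could feed a user-identification model.

    import numpy as np
    from sklearn.svm import SVC

    N_CHANNELS = 8   # the Myo armband exposes 8 sEMG channels
    WINDOW = 200     # samples per analysis window (assumed value)

    def rms_feature(window: np.ndarray) -> np.ndarray:
        """Per-channel RMS: sqrt(mean(x^2)) over the time axis."""
        return np.sqrt(np.mean(np.square(window), axis=0))

    # Synthetic placeholder data standing in for recorded sEMG windows:
    # 100 windows of shape (WINDOW, N_CHANNELS), labelled by user identity.
    rng = np.random.default_rng(0)
    windows = rng.standard_normal((100, WINDOW, N_CHANNELS))
    labels = rng.integers(0, 3, size=100)  # three hypothetical users

    features = np.array([rms_feature(w) for w in windows])
    clf = SVC(kernel="rbf").fit(features, labels)  # plain SVC as a stand-in
    print(clf.predict(features[:5]))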
