Abstract

Hand gestures are one of the most common communication approaches in daily human life, especially for people who are deaf or mute. Hand gesture recognition can be adopted in human-computer interaction to convert hand gestures into words or sentences. Unfortunately, the same gesture may have diverse meanings in different countries. With the aim of eliminating communication barriers between hearing-impaired communities and the general public, this paper proposes an efficient interactive user interface, built with augmented reality and a Leap Motion controller, for hand gesture recognition and translation. Five hand gestures captured by a Leap Motion controller were used for learning and recognition with machine learning methodologies, including Support Vector Machine, K-Nearest Neighbor, Convolutional Neural Network, Deep Neural Network and Decision Tree. The experimental results from the different classifiers reveal the practicability of employing hand gesture recognition in text translation. The hand gesture recognition system should be capable of reducing the communication gap between people with hearing disabilities and the public, helping to prevent deaf and mute people from being isolated from society.
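The abstract names several classical classifiers (Support Vector Machine, K-Nearest Neighbor, Decision Tree) trained on Leap Motion hand data. As a minimal sketch of how such a comparison could be set up, the snippet below trains those three classifiers on synthetic stand-ins for Leap Motion feature vectors; the feature layout (five fingertip positions in x, y, z) and the Gaussian-cluster data are assumptions for illustration, not the paper's actual dataset or features.

```python
# Hedged sketch: comparing three of the classifiers named in the paper
# (SVM, K-Nearest Neighbor, Decision Tree) on synthetic stand-ins for
# Leap Motion feature vectors. The feature layout (5 fingertips x 3
# coordinates) and the data generation are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
N_GESTURES, SAMPLES_PER_GESTURE, N_FEATURES = 5, 40, 15

# Synthetic dataset: each of the 5 gesture classes is a Gaussian cluster
# in 15-dimensional feature space (well separated, for demonstration).
X = np.vstack([
    rng.normal(loc=g, scale=0.3, size=(SAMPLES_PER_GESTURE, N_FEATURES))
    for g in range(N_GESTURES)
])
y = np.repeat(np.arange(N_GESTURES), SAMPLES_PER_GESTURE)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

classifiers = {
    "SVM": SVC(kernel="rbf"),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "DecisionTree": DecisionTreeClassifier(random_state=0),
}
scores = {}
for name, clf in classifiers.items():
    clf.fit(X_tr, y_tr)             # train on the gesture feature vectors
    scores[name] = clf.score(X_te, y_te)  # held-out accuracy
    print(f"{name}: accuracy = {scores[name]:.2f}")
```

On real Leap Motion captures the accuracies would depend on how the hand features are extracted and normalized; the sketch only shows the train/evaluate structure of such a comparison.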
