Abstract

Sign language is a form of non-verbal communication used by people with hearing or speech disabilities to express their thoughts and feelings. However, people unfamiliar with sign language find it difficult to understand these hand gestures, so a translator is usually needed when a speech- or hearing-impaired person wants to communicate with a hearing person and vice versa. To enable such people to communicate effectively with those around them, this paper proposes a system that translates Indian Sign Language (ISL) hand gestures for the numbers 1-9, the English alphabet (A-Z) and a few English words into readable text, and vice versa. The system is built using image processing techniques and machine learning algorithms. Several neural network classifiers are developed, tested and validated for their gesture-recognition performance, and the most efficient classifier is identified.
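The classifier comparison the abstract describes could, in its simplest form, be sketched as below: a one-hidden-layer neural network trained on feature vectors extracted from gesture images. The paper gives no dataset or architecture details, so the feature vectors here are synthetic clusters standing in for three hypothetical gesture classes, and the network shape and learning rate are illustrative assumptions.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax, shifted for numerical stability."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

class MLPClassifier:
    """Minimal one-hidden-layer network trained with gradient descent
    on the cross-entropy loss. Hyperparameters are illustrative."""

    def __init__(self, n_in, n_hidden, n_classes, lr=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_classes))
        self.b2 = np.zeros(n_classes)
        self.lr = lr

    def _forward(self, X):
        h = np.tanh(X @ self.W1 + self.b1)          # hidden activations
        return h, softmax(h @ self.W2 + self.b2)    # class probabilities

    def fit(self, X, y, epochs=300):
        Y = np.eye(self.W2.shape[1])[y]             # one-hot labels
        for _ in range(epochs):
            h, p = self._forward(X)
            dz2 = (p - Y) / len(X)                  # output-layer gradient
            dh = dz2 @ self.W2.T * (1.0 - h**2)     # backprop through tanh
            self.W2 -= self.lr * (h.T @ dz2)
            self.b2 -= self.lr * dz2.sum(axis=0)
            self.W1 -= self.lr * (X.T @ dh)
            self.b1 -= self.lr * dh.sum(axis=0)

    def predict(self, X):
        return self._forward(X)[1].argmax(axis=1)

# Synthetic stand-in data: three well-separated "gesture" classes,
# each a Gaussian cluster of 4-dimensional feature vectors.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.3, (40, 4)) for c in (0.0, 1.0, 2.0)])
y = np.repeat([0, 1, 2], 40)

clf = MLPClassifier(n_in=4, n_hidden=16, n_classes=3)
clf.fit(X, y)
acc = (clf.predict(X) == y).mean()
```

In the paper's setting, `X` would instead hold features computed from segmented hand-gesture images, and the same train/validate loop would be repeated for each candidate classifier so their accuracies can be compared.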

