Abstract

Sign language is one of the most reliable ways of communicating with people with special needs, as it can be used anywhere. However, most people do not understand sign language. We therefore developed a desktop application that recognizes sign language and converts it to text in real time. This research uses American Sign Language (ASL) datasets and a Convolutional Neural Network (CNN) classifier. During classification, the hand image is first passed through a filter; the filtered image is then passed to a classifier that predicts the class of the hand gesture. This research focuses on recognition accuracy. Our application achieved 96.3% accuracy for the 26 letters of the alphabet.
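The two-stage pipeline described above (a filter applied to the hand image, followed by a classifier over the 26 letter classes) can be sketched as follows. This is a minimal NumPy illustration of the structure only, not the authors' actual model: the thresholding filter, the single convolution kernel, and the random linear weights are all placeholder assumptions standing in for a trained CNN.

```python
import numpy as np

def preprocess(image: np.ndarray) -> np.ndarray:
    """Filter step (assumed: grayscale + threshold to isolate the hand)."""
    gray = image.mean(axis=2)                        # (H, W) grayscale
    return (gray > gray.mean()).astype(np.float32)   # crude binary mask

def conv2d(x: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Naive 'valid' 2-D convolution over a single channel."""
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def classify(image: np.ndarray, kernel: np.ndarray, weights: np.ndarray) -> int:
    """Classifier step: conv + ReLU, global average pool, linear layer, argmax.

    Returns an index 0..25 mapping to letters 'A'..'Z'.
    """
    filtered = preprocess(image)
    feat = np.maximum(conv2d(filtered, kernel), 0)   # ReLU activation
    pooled = feat.mean()                             # global average pooling
    logits = weights * pooled                        # one logit per letter (26,)
    return int(np.argmax(logits))
```

A real implementation would use a deep-learning framework with several trained convolutional layers; this sketch only shows how the filter output feeds the classifier and how the 26-way prediction is produced.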
