Abstract

Sign language is an essential means of communication for deaf and hard-of-hearing individuals. However, sign language is not universal: every country has its own native sign language, and in India, Indian Sign Language (ISL) is used. This survey provides an overview of the recognition and translation of Indian Sign Language. While significant research has been conducted on American Sign Language (ASL), the same cannot be said for Indian Sign Language due to its unique characteristics. The proposed method focuses on designing a tool for translating ISL hand gestures to help the deaf-mute community convey their ideas. A self-created ISL dataset was used to train the model for gesture recognition. The literature contains a plethora of methods for extracting features and classifying sign language, most of them based on classical machine learning techniques. This article instead proposes a deep learning method, designing a Convolutional Neural Network (CNN) model to extract sign language features and recognize them accurately. The CNN model is designed to identify complex patterns in the data and use them to recognize sign language gestures efficiently. By adopting this approach, sign language recognition is expected to improve significantly, providing a more effective means of communication for the deaf and hard-of-hearing community.
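
The abstract does not specify the network's exact architecture; the sketch below only illustrates the kind of CNN gesture classifier described. The input resolution (64x64 grayscale), layer sizes, and class count are assumptions chosen for illustration, not details taken from the paper.

# Minimal sketch of a CNN classifier for ISL hand-gesture images.
# Input size, depth, and number of classes are illustrative assumptions.
import torch
import torch.nn as nn

class ISLGestureCNN(nn.Module):
    def __init__(self, num_classes: int = 35):  # assumed class count (e.g. letters + digits)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),   # low-level edge/shape filters
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 64x64 -> 32x32
            nn.Conv2d(32, 64, kernel_size=3, padding=1),   # higher-level hand-shape patterns
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 32x32 -> 16x16
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(128, num_classes),                   # one logit per gesture class
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Usage: a batch of 8 single-channel 64x64 gesture images.
if __name__ == "__main__":
    model = ISLGestureCNN()
    logits = model(torch.randn(8, 1, 64, 64))
    print(logits.shape)  # torch.Size([8, 35])

Trained with a standard cross-entropy loss on the self-created ISL dataset mentioned in the abstract, such a model maps each gesture image to a class score, replacing the hand-crafted feature extraction used in earlier machine-learning approaches.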
