Abstract

Purpose: Sign language is a communication medium for hearing-impaired people. Although sign language works well between two signers, communication becomes much harder when they must interact with the wider world, which rarely understands sign language. The aim of this work is to develop a computer-vision-based Indian Sign Language (ISL) fingerspelling identification system that recognizes alphabetic signs as well as emergency words without any additional equipment, thereby narrowing the communication gap between hearing-impaired and hearing people.

Methodology: We surveyed the literature using the PRISMA methodology. The survey shows that deep learning techniques such as convolutional neural networks (CNNs) promise better accuracy than other neural network architectures that rely on hand-crafted feature extraction. We aggregated two existing datasets into a single dataset of around 50,000 images. Our approach is to implement a CNN that can be employed in real-time applications to detect hand gestures; its output is then forwarded to a long short-term memory (LSTM) model (a sequence-to-sequence learning algorithm), which encodes the gesture and decodes it into text.

Findings: Much research has been published on American Sign Language (ASL) recognition, but those methods cannot be applied directly to ISL: ASL fingerspelling uses one hand, whereas ISL uses two, so the hands overlap and the distinguishing features become harder to extract. The major problem across related work is the lack of an available ISL dataset; very little work has been done on creating such datasets.

Value: The proposed system enables automatic recognition of sign language, helping hearing-impaired people communicate with the hearing world.

Originality: Similar systems have addressed only alphabet recognition. We also aim to augment the dataset with frequently used emergency words and gestures, so that hearing-impaired people can convey important messages in emergencies as well as in routine activities.
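To make the described pipeline concrete, the following is a minimal sketch, in TensorFlow/Keras, of how a per-frame CNN feature extractor could feed an LSTM that classifies a gesture clip. It is an illustrative assumption, not the authors' released code: the 64x64 grayscale input, 30-frame clips, and 36-class vocabulary (26 alphabet signs plus 10 hypothetical emergency words) are placeholder values rather than details from the paper.

    # Sketch only: CNN-per-frame features feeding an LSTM gesture classifier.
    # Assumed shapes and class count are illustrative, not from the paper.
    import tensorflow as tf
    from tensorflow.keras import layers, models

    NUM_CLASSES = 36           # assumed: 26 alphabet signs + 10 emergency words
    FRAME_SHAPE = (64, 64, 1)  # assumed 64x64 grayscale frames
    SEQ_LEN = 30               # assumed frames per gesture clip

    # Per-frame CNN: extracts a 128-dimensional feature vector from one frame.
    cnn = models.Sequential([
        layers.Conv2D(32, 3, activation="relu", input_shape=FRAME_SHAPE),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
    ])

    # Sequence model: apply the CNN to every frame, then let an LSTM encode
    # the gesture over time and predict one class (a letter or word) per clip.
    model = models.Sequential([
        layers.TimeDistributed(cnn, input_shape=(SEQ_LEN, *FRAME_SHAPE)),
        layers.LSTM(64),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()

In this simplified form each clip is mapped to a single letter or word, which would then be concatenated into text; the full sequence-to-sequence design the abstract proposes would instead pair the LSTM encoder with an LSTM decoder that emits the output text token by token.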
