Abstract

One of the major challenges deaf people face in modern society is communication. For those engaged in agricultural work, efficiency and productivity depend heavily on how well the sign language used by deaf farmers is understood. Employing sign language interpreters is not a practical solution to this problem, which motivates the development of a reliable system for automatic sign language recognition (SLR). This paper reports work on the recognition of hand gestures for Indian Sign Language (ISL) words commonly used by deaf farmers. A hybrid deep learning model based on a convolutional long short-term memory (LSTM) network is exploited for gesture classification. The model attains an average classification accuracy of 76.21% on the proposed dataset of ISL words from the agricultural domain.
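The abstract names a convolutional LSTM as the classifier but gives no architectural details. As a rough illustration only, the following is a minimal sketch of what such a video-gesture classifier could look like in Keras; the clip length, frame resolution, number of classes, and layer widths are all assumptions, not values from the paper.

```python
import tensorflow as tf

# Assumed, illustrative dimensions (not from the paper):
NUM_CLASSES = 10                  # hypothetical number of ISL word classes
FRAMES, H, W, C = 16, 64, 64, 3   # hypothetical clip length and frame size

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(FRAMES, H, W, C)),
    # ConvLSTM2D learns spatio-temporal features across the frame sequence,
    # combining convolutional spatial filtering with LSTM-style recurrence.
    tf.keras.layers.ConvLSTM2D(32, kernel_size=(3, 3), return_sequences=False),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy")
```

Such a model would be trained on short sign-video clips labelled with the corresponding ISL word, with the softmax layer producing a probability per word class.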
