Abstract

As per Census 2011, India had 26.8 million differently abled people, of whom more than 25% faced difficulty in vocal communication. They use Indian Sign Language (ISL) to communicate with others. The proposed solution is a sensor-based Hand Gesture Recognition (HGR) wearable device capable of translating and conveying messages from the vocally challenged community. The proposed method designs a hand glove by integrating flex and Inertial Measurement Unit (IMU) sensors into the HGR wearable device, wherein hand and finger movements are captured as gestures. These gestures are mapped to the ISL dictionary using machine learning techniques that learn the spatio-temporal variations in the gestures for classification. The novelty of the work lies in enhancing the capacity of HGR by extracting the spatio-temporal variations of an individual's gestures and adapting to changes in their dynamics with aging and context, through the proposed Dynamic Spatio-Temporal Warping (DSTW) technique combined with a Long Short-Term Memory based learning model. From the sequence of recognized gestures and their ISL mappings, grammatically correct sentences are constructed using transformer-based Natural Language Processing (NLP) models. The sentences are then conveyed to the user through a suitable communication medium, such as text-to-voice or text-to-image. The proposed HGR device, together with the Bidirectional Long Short-Term Memory (BiLSTM) and DSTW techniques, is implemented and evaluated for accuracy, precision, and reliability in gesture recognition. Experiments capturing varied gestures and their recognition yielded an accuracy of 98.91%.
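The DSTW technique described above builds on the idea of warping two gesture time series so that time-stretched performances of the same sign still align. As a minimal illustrative sketch (classic dynamic time warping over multivariate sensor frames, not the authors' DSTW variant; all names below are hypothetical):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two gesture recordings.

    a, b: arrays of shape (T, D) -- T time steps of D sensor readings
    (e.g. flex-sensor values plus IMU accelerations per frame).
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # frame-wise distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# A gesture template and a time-stretched performance of the same gesture:
template = np.array([[0.0], [1.0], [2.0], [1.0], [0.0]])
stretched = np.array([[0.0], [0.0], [1.0], [2.0], [2.0], [1.0], [0.0]])
print(dtw_distance(template, stretched))  # -> 0.0: alignment absorbs the stretch
```

Such a warping distance can be computed between a live recording and each stored gesture template, with the BiLSTM classifier handling the learned spatio-temporal patterns.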
