Abstract

There is a segment of society that does not have access to today's sophisticated acoustic communication; for them, gesture-based sign language, which uses the hands, the shoulders, or the eyes, can be a vital tool for making themselves heard. The most widely used sign language in the world, known as ALC (American Linguistic Communication), varies slightly from nation to nation. Deaf and mute people can communicate effectively by using hand gestures to convey their message. The wearable glove we developed for this study translates ALC motions into the corresponding alphabets and words. It uses a glove fitted with a number of flex sensors over the fingers' distal and proximal interphalangeal joints, as well as the metacarpophalangeal joints, to detect finger bending. The complete system is divided into three units: a wearable hand-glove unit whose flex sensors record user-created ALC gestures, a processing unit responsible for acquiring the sensor data, and a final unit that uses a machine-learning classifier to identify the appropriate alphabet. The smartphone unit is linked to the processing unit over a wired channel, so the recognized alphabet is delivered as text to the mobile "Sign to Speech App," which displays it to the user. Its user-friendly design, low cost, and availability on mobile platforms give it an edge over traditional gesture-language techniques.
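
The abstract describes a pipeline in which flex-sensor readings are classified into alphabet labels before the text is forwarded to the app. The following is a minimal, hypothetical sketch of that classification stage in Python, assuming five normalized flex readings per gesture and a k-nearest-neighbours model standing in for the paper's classifier; the sensor values, labels, and helper names are illustrative and not taken from the paper.

```python
# Hypothetical sketch of the processing unit's classification step.
# Assumptions (not from the paper): 5 flex readings per gesture, normalized
# to [0, 1], and a k-nearest-neighbours classifier as the "machine classifier".
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Synthetic calibration data: each row is one gesture sample (5 flex values)
# labeled with the letter it represents. Values are illustrative only.
X_train = np.array([
    [0.90, 0.90, 0.90, 0.90, 0.10],  # fist-like shape       -> "A"
    [0.10, 0.10, 0.10, 0.10, 0.90],  # open-hand-like shape  -> "B"
    [0.50, 0.50, 0.50, 0.50, 0.50],  # curved-finger shape   -> "C"
])
y_train = np.array(["A", "B", "C"])

classifier = KNeighborsClassifier(n_neighbors=1)
classifier.fit(X_train, y_train)

def classify_gesture(flex_readings):
    """Map one set of normalized flex-sensor readings to an alphabet label."""
    return classifier.predict(np.array(flex_readings).reshape(1, -1))[0]

if __name__ == "__main__":
    # Simulated glove reading; in the real system this would come from the
    # flex sensors via the processing unit.
    sample = [0.88, 0.92, 0.85, 0.90, 0.12]
    letter = classify_gesture(sample)
    # The recognized letter would then be sent as text over the wired channel
    # to the "Sign to Speech App" for display.
    print(f"Recognized letter: {letter}")
```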
