Abstract

The purpose of this effort is to promote equitable environments and enable people with hearing disabilities to communicate in their native language by proposing an AI-based sign language translator. We use a transformer neural network that analyzes more than 500 data points from a person's face and gestures to translate sign language into text. Our machine learning pipeline allows the translator to expand: new datasets can be produced and new sign language recognition models built from them. As a proof of concept, we developed an interpreter for emergency calls covering more than 200 sign language words. The main goal is to empower deaf people to participate in the social, political, economic, and cultural spheres of life. Every day we encounter people with impairments, including blindness, deafness, and muteness, who have difficulty interacting with others. Because this study describes two-way communication between deaf or mute and hearing individuals, the proposed method can translate sign language into both text and voice.

Key Words: Sign Language, Inclusion, Social Development, Artificial Intelligence, Machine Learning.
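To make the described pipeline concrete, the sketch below shows one plausible way to implement it: per-frame face and hand landmark extraction followed by a transformer that classifies a clip into one of the vocabulary's sign words. The abstract only states that a transformer analyzes over 500 data points from the face and gestures; the use of MediaPipe Holistic (whose 468 face, 2 x 21 hand, and 33 pose landmarks total 543 points), the layer sizes, and the single-sign classification framing are all assumptions for illustration, not details from the paper.

```python
# Hypothetical sketch: landmark features -> transformer -> sign-word logits.
# MediaPipe Holistic and all hyperparameters below are assumptions.
import numpy as np
import torch
import torch.nn as nn
import mediapipe as mp

NUM_LANDMARKS = 543           # 468 face + 2 x 21 hands + 33 pose (MediaPipe Holistic)
FEATURES = NUM_LANDMARKS * 3  # (x, y, z) per landmark
NUM_SIGNS = 200               # vocabulary size of the emergency-call prototype

holistic = mp.solutions.holistic.Holistic(static_image_mode=False)

def frame_to_vector(frame_rgb: np.ndarray) -> np.ndarray:
    """Flatten all landmarks detected in one RGB video frame into a feature vector."""
    results = holistic.process(frame_rgb)
    coords = np.zeros((NUM_LANDMARKS, 3), dtype=np.float32)
    offset = 0
    for lm_set, count in (
        (results.face_landmarks, 468),
        (results.left_hand_landmarks, 21),
        (results.right_hand_landmarks, 21),
        (results.pose_landmarks, 33),
    ):
        if lm_set is not None:  # missing parts (e.g. occluded hand) stay zero
            coords[offset:offset + count] = [(p.x, p.y, p.z) for p in lm_set.landmark]
        offset += count
    return coords.reshape(-1)

class SignTransformer(nn.Module):
    """Transformer encoder mapping a landmark sequence to sign-word logits."""
    def __init__(self, d_model: int = 256, num_layers: int = 4):
        super().__init__()
        self.embed = nn.Linear(FEATURES, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, NUM_SIGNS)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, frames, FEATURES) -> logits: (batch, NUM_SIGNS)
        h = self.encoder(self.embed(x))
        return self.head(h.mean(dim=1))  # average-pool over the time axis
```

In this framing, each video clip yields one sign word; recognized words could then be passed to a text-to-speech system to produce the voice output the abstract mentions. Continuous signing would instead require a sequence-to-sequence decoder, which this sketch does not attempt.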
