Abstract

Individuals with speech impairments face significant communication challenges. To address this, we developed a system that translates custom hand gestures into written text and spoken language. The system combines flex sensors with machine learning, enabling individuals with speech impairments to express themselves effectively and improving their overall communication experience. Flex sensors connected to an Arduino board supply the signal data, which is collected, analyzed, and processed in MATLAB. Through data labeling, training of a machine learning model, and accuracy evaluation, the system recognizes a wide range of gestures, including the numbers 0-9, the letters A-Z, and 10 Kannada-language gestures conveying meaningful phrases. The hand gesture recognition system achieved an accuracy of 83.5%, a precision of 88.6%, a recall of 95.1%, and an F1 score of 91.7%.

Key Words: speech impairments, hand gestures, gesture recognition, flex sensors,
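The reported metrics are internally consistent: the F1 score is the harmonic mean of precision and recall, so it can be cross-checked from the other two figures. A minimal sketch (in Python rather than the MATLAB pipeline the paper describes; the numeric values are the metrics reported above):

```python
# Metrics reported in the abstract
precision = 0.886
recall = 0.951

# F1 is the harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)

print(round(f1, 3))  # 0.917, matching the reported F1 score of 91.7%
```

That the F1 score (91.7%) exceeds the accuracy (83.5%) is plausible here: F1 ignores true negatives, so a classifier that is strong on the positive class can score higher on F1 than on overall accuracy.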
