Abstract
Individuals with speech impairments face significant communication challenges. To address this, we have developed a system that translates custom hand gestures into written text and spoken language. The system combines flex sensors with machine learning techniques, enabling individuals with speech impairments to express themselves effectively and improve their overall communication experience. Flex sensors are connected to an Arduino board, and MATLAB is used for data collection, analysis, and processing. Through data labeling, training of a machine learning model, and accuracy evaluation, the system recognizes a wide range of gestures, including the numbers 0-9, the letters A-Z, and 10 Kannada-language gestures conveying meaningful phrases. The hand gesture recognition system achieved an accuracy of 83.5%, a precision of 88.6%, a recall of 95.1%, and an F1 score of 91.7%.

Key Words: speech impairments, hand gestures, gesture recognition, flex sensors,
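For reference, the reported precision, recall, and F1 score follow the standard definitions below; the quoted F1 of 91.7% is consistent with the stated precision (88.6%) and recall (95.1%).

\[
\text{Precision} = \frac{TP}{TP + FP}, \qquad
\text{Recall} = \frac{TP}{TP + FN}, \qquad
F_1 = 2 \cdot \frac{\text{Precision} \cdot \text{Recall}}{\text{Precision} + \text{Recall}}
\approx 2 \cdot \frac{0.886 \cdot 0.951}{0.886 + 0.951} \approx 0.917
\]

To illustrate the sensor-to-label pipeline described above, the following is a minimal MATLAB sketch, not the authors' implementation: it assumes the Arduino streams comma-separated flex-sensor readings over serial and that a classifier trained in MATLAB has been saved to disk. The port name, baud rate, sensor count, and file/variable names (gestureModel.mat, mdl) are placeholders.

% Read flex-sensor vectors from an Arduino and classify each one (sketch).
s = serialport("COM3", 9600);        % assumed serial port and baud rate
numSensors = 5;                      % assumed: one flex sensor per finger
load("gestureModel.mat", "mdl");     % hypothetical previously trained classifier

while true
    rawLine = readline(s);                       % e.g. "312,287,455,301,298"
    x = str2double(split(rawLine, ","))';        % 1 x numSensors feature vector
    if numel(x) == numSensors && all(~isnan(x))
        label = predict(mdl, x);                 % predicted gesture class
        disp(label)                              % could be forwarded to text-to-speech
    end
end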