Abstract
Sign language is the primary means of daily communication for people with speech and hearing impairments. This paper presents a framework that transforms 24 continuous, real-time American Sign Language alphabet gestures into English text recognizable by humans and/or devices. The real-time alphabet gestures considered for testing are invariant to location, background, illumination, skin color, gender, and distance. Real-time hand gesture recognition is carried out using a regular web camera. After preprocessing, the Histogram of Oriented Gradients (HOG) descriptor is used for hand feature extraction. Classification and recognition of the gestures are performed with a K-Nearest Neighbor (KNN) classifier for k = 3, 5, and 7. The framework yields an overall average recognition rate of 98.44% for continuous real-time alphabet gestures using the KNN classifier with k = 3, at the lowest recognition time of 0.38 seconds per gesture — outperforming k = 5 (accuracy 93.75%, 0.42 seconds per gesture) and k = 7 (accuracy 90.10%, 0.45 seconds per gesture), as well as state-of-the-art techniques in a real-time environment.
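The HOG-plus-KNN pipeline described above can be illustrated with a minimal sketch. This is not the paper's implementation — the cell size, bin count, and majority-vote KNN below are generic textbook choices, written with NumPy only, and the stripe-pattern images stand in for real preprocessed hand silhouettes:

```python
import numpy as np

def hog_features(img, cell=8, bins=9):
    """Simplified HOG: per-cell orientation histograms of gradient magnitude."""
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]          # horizontal gradient
    gy[1:-1, :] = img[2:, :] - img[:-2, :]          # vertical gradient
    mag = np.hypot(gx, gy)                          # gradient magnitude
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180      # unsigned orientation
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i+cell, j:j+cell].ravel()
            a = ang[i:i+cell, j:j+cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            feats.append(hist / (np.linalg.norm(hist) + 1e-6))  # L2-normalize cell
    return np.concatenate(feats)

def knn_predict(X_train, y_train, x, k=3):
    """Classify feature vector x by majority vote of its k nearest neighbors."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(dists)[:k]]
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]
```

With HOG features, gestures that differ in edge orientation produce well-separated vectors, so even this plain Euclidean KNN separates, say, vertically-striped from horizontally-striped test images; the paper's reported trade-off (k = 3 fastest and most accurate) is consistent with the smaller vote pool requiring fewer distance comparisons per decision.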