Abstract

Many people in the world live with disabilities; among them, people who are deaf or mute cannot easily convey their messages to hearing people. Conversation becomes very difficult for them: a deaf person cannot hear what a hearing person says, and a mute person must convey their message using sign language, which a hearing person cannot understand unless he or she knows that sign language. This creates the need for an application that enables conversation between deaf, mute, and hearing people. Here we use hand gestures of Indian Sign Language (ISL), which contains gestures for all the alphabets and the digits 0-9. We created the dataset of alphabet and digit gestures ourselves. After building the dataset, we extracted features using bag-of-words together with image preprocessing. From the extracted features, histograms are generated that map alphabets to images. Finally, these features are fed to a supervised machine learning model to predict the gesture/sign. We also trained a CNN model for comparison.
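The bag-of-words step described above can be illustrated with a minimal sketch: local descriptors from a gesture image are quantized against a learned codebook of "visual words", and the resulting normalized histogram becomes the feature vector fed to the classifier. The helper name, codebook, and descriptor values below are hypothetical, not taken from the paper.

```python
import numpy as np

def bovw_histogram(descriptors, codebook):
    """Quantize local descriptors to their nearest visual word and build a
    normalized bag-of-visual-words histogram (hypothetical helper)."""
    # Squared distance from every descriptor to every codeword.
    d2 = ((descriptors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    words = d2.argmin(axis=1)  # nearest visual word for each descriptor
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()   # normalize so images with different descriptor counts are comparable

# Toy example: a codebook of 4 visual words in 2-D, and 6 descriptors
# pretending to come from one gesture image.
codebook = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
desc = np.array([[0.1, 0.0], [0.9, 0.1], [0.1, 0.9],
                 [1.0, 1.1], [0.0, 0.1], [0.95, 0.05]])
h = bovw_histogram(desc, codebook)
print(h)  # one bin per visual word; bins sum to 1
```

In a full pipeline, the codebook would be learned by clustering descriptors from all training images (e.g. with k-means), and the histograms of labeled gesture images would train the supervised classifier.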
