Abstract

Human–Computer Interaction is the study of how humans and computers interact. Hand gestures are an effective way to communicate when speech alone is not understood, and recognizing them is essential to ensuring the listener grasps what is being said. The main idea of this project is to explore different approaches to hand gesture recognition, first with radar data and then with a camera sensor. We first attempted to build hand gesture recognition from radar data; then, because most people do not know sign language and interpreters are scarce, we developed a real-time approach to American Sign Language fingerspelling based on neural networks, followed by another model built with MediaPipe. We propose a neural-network method to detect human hand gestures from camera-recorded images: each gesture image first passes through a filter, and the filtered gesture is then fed to a classifier that predicts the gesture type. Whereas the existing radar-based system is unable to detect static gestures, our deep-learning approach captures both static and dynamic gestures through MediaPipe.
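As a minimal sketch of the filter-then-classify pipeline the abstract describes, the snippet below normalizes hand-landmark coordinates (such as the 21 points MediaPipe Hands produces) and assigns a gesture label with a nearest-centroid classifier. The normalization step, the gesture templates, and the classifier itself are illustrative assumptions, not the paper's actual model:

```python
import math

def normalize_landmarks(landmarks):
    """Translate landmarks so the wrist point (index 0) sits at the origin,
    then scale so the farthest point lies at distance 1.  This makes the
    feature vector invariant to hand position and size in the frame."""
    x0, y0 = landmarks[0]
    shifted = [(x - x0, y - y0) for x, y in landmarks]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def classify(landmarks, templates):
    """Nearest-centroid classifier (illustrative stand-in for the paper's
    neural-network classifier): return the label of the template whose
    normalized landmarks are closest in squared Euclidean distance."""
    feat = normalize_landmarks(landmarks)

    def dist(label):
        return sum((a - c) ** 2 + (b - d) ** 2
                   for (a, b), (c, d) in zip(feat, templates[label]))

    return min(templates, key=dist)
```

In use, `templates` would map each gesture label to one normalized example, and because both sides are normalized the same way, a hand seen at a different position or scale still matches its template.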
