Abstract

Hand gestures have served as an important means of human communication throughout history. Hand gesture recognition is widely applied to the understanding of sign languages, which serve as the primary mode of communication for individuals with hearing and speech impairments. Recognition of universal and customised hand gestures also bridges the gap between those who are fluent in sign language and those who are not. Beyond this, hand gesture understanding has broad applications in automation, virtual reality, touchless user interfaces, and other modes of human-computer interaction. In this paper, we propose a hybrid architecture for hand gesture recognition that uses MediaPipe for hand detection and tracking, together with several neural network architectures for training and classifying the gestures. A subsample of the publicly available HaGRID (HAnd Gesture Recognition Image Dataset), comprising eighteen hand gestures and a no-gesture category, is used. The gesture classification model is built by applying transfer learning to the DenseNet201 deep neural network architecture, achieving a validation loss of 0.115 and a validation accuracy of 97.55%.
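The abstract describes the pipeline only at a high level. The sketch below illustrates one plausible way to wire it together in Python with Keras/TensorFlow and MediaPipe: MediaPipe Hands locates and crops the hand, and a DenseNet201 backbone with a frozen base and a new 19-way softmax head classifies the gesture. The input resolution, crop margin, and head layers are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of a MediaPipe + DenseNet201 gesture-recognition pipeline.
# Hyperparameters and head layers are assumptions for illustration only.
import cv2
import mediapipe as mp
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import DenseNet201

NUM_CLASSES = 19             # 18 HaGRID gestures + a no-gesture category
INPUT_SIZE = (224, 224)      # assumed classifier input resolution

# --- Gesture classifier: DenseNet201 backbone with a new classification head ---
base = DenseNet201(weights="imagenet", include_top=False,
                   input_shape=(*INPUT_SIZE, 3))
base.trainable = False       # freeze the pre-trained convolutional base
classifier = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),     # assumed regularization
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
classifier.compile(optimizer="adam",
                   loss="categorical_crossentropy",
                   metrics=["accuracy"])

# --- Hand detection and cropping with MediaPipe Hands ---
def crop_hand(bgr_image, margin=0.2):
    """Return a crop around the first detected hand, or None if no hand is found."""
    rgb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB)
    with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
        result = hands.process(rgb)
    if not result.multi_hand_landmarks:
        return None
    h, w, _ = bgr_image.shape
    xs = [lm.x for lm in result.multi_hand_landmarks[0].landmark]
    ys = [lm.y for lm in result.multi_hand_landmarks[0].landmark]
    x0 = max(int((min(xs) - margin) * w), 0)
    x1 = min(int((max(xs) + margin) * w), w)
    y0 = max(int((min(ys) - margin) * h), 0)
    y1 = min(int((max(ys) + margin) * h), h)
    return bgr_image[y0:y1, x0:x1]

# --- Inference on a single image (after the classifier has been trained) ---
def predict_gesture(bgr_image):
    crop = crop_hand(bgr_image)
    if crop is None:
        return None               # treat as the no-gesture category upstream
    crop = cv2.resize(crop, INPUT_SIZE)
    x = tf.keras.applications.densenet.preprocess_input(
        np.expand_dims(crop.astype(np.float32), axis=0))
    return int(np.argmax(classifier.predict(x, verbose=0)))
```

Freezing the DenseNet201 base and training only the new head is the standard transfer-learning setup; a fine-tuning stage that unfreezes upper layers at a low learning rate is a common follow-on step, though the paper's exact training schedule is not stated in the abstract.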
