Abstract

The goal of static hand gesture recognition is to classify given hand gesture data, represented by some set of features, into a predefined finite number of gesture classes. This paper presents a novel technique for hand gesture recognition in human–computer interaction based on shape analysis. The main objective of this work is to explore the utility of two feature extraction algorithms for recognizing hand gestures. An artificial neural network is used as the recognizer to assign these features to their respective hand gesture classes. The proposed system recognizes a set of six specific static hand gestures, namely, Open, Close, Cut, Paste, Maximize, and Minimize. Each hand gesture image passes through three stages: preprocessing, feature extraction, and classification. In the preprocessing stage, operations are applied to separate the hand gesture from its background and prepare the image for the feature extraction stage. In the first method, the hand contour is used as the feature, which handles the scaling and translation problems (in some cases). The complex moment algorithm, in contrast, describes the hand gesture and handles rotation in addition to scaling and translation. The classifier is a multi-layer neural network trained with the back-propagation learning algorithm. The results show that the first method achieves a recognition rate of 70.83%, while the second method, proposed in this article, achieves a better recognition rate of 86.38%.
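As a rough illustration of the rotation-invariant description mentioned above, the sketch below computes complex-moment magnitudes from a binary hand silhouette. The moment orders, the normalization choices, and the toy mask are assumptions made for illustration; this is not the authors' exact formulation or preprocessing pipeline.

```python
import numpy as np

def complex_moment_features(mask, orders=((1, 1), (2, 0), (2, 1), (3, 0))):
    """Illustrative complex-moment features for a binary hand silhouette.

    mask  : 2-D array, nonzero where the hand is present.
    orders: (p, q) moment orders to compute (assumed here, not taken from the paper).

    Complex moment: c_pq = mean over hand pixels of (x + iy)^p * (x - iy)^q,
    with coordinates taken relative to the centroid (translation invariance)
    and divided by the radius of gyration (scale normalization).
    A rotation only shifts the phase of c_pq, so |c_pq| is rotation invariant.
    """
    ys, xs = np.nonzero(mask)                       # coordinates of hand pixels
    x = xs - xs.mean()                              # centre on the centroid
    y = ys - ys.mean()
    r = np.sqrt((x**2 + y**2).mean())               # radius of gyration for scale normalization
    z = (x + 1j * y) / r
    zc = np.conj(z)
    feats = [np.abs(np.mean(z**p * zc**q)) for p, q in orders]
    return np.asarray(feats)

# Usage: a toy rectangular "hand" blob; in practice the mask would come from
# the preprocessing stage that separates the hand from its background.
mask = np.zeros((64, 64), dtype=np.uint8)
mask[20:45, 25:40] = 1
print(complex_moment_features(mask))
```

In a full system, a feature vector of this kind would then be fed to the multi-layer neural network classifier described in the abstract.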
