Abstract

In recent years, human movements, especially hand gestures, have motivated research in gesture modeling, analysis, and recognition. Hand gesture recognition provides an intelligent, natural, and convenient means of human-robot interaction (HRI). According to how gestures are input, current gesture recognition techniques fall into two categories: vision-based and data-glove-based. To address some problems with existing data gloves, in this paper we use a novel data glove called YoBu to collect data for gesture recognition. We also attempt to use an extreme learning machine (ELM) for gesture recognition, which has not yet been reported in this application. In addition, we analyze which features play an important role in classification, collect data of static gestures, and establish a gesture dataset.
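The abstract does not detail the classifier, but the standard ELM training procedure it refers to is simple: input-to-hidden weights are drawn randomly and left fixed, and only the hidden-to-output weights are solved in closed form via the Moore-Penrose pseudoinverse. A minimal sketch, using illustrative toy data rather than the paper's glove features:

```python
import numpy as np

# Minimal extreme learning machine (ELM) sketch for classification.
# The feature dimension, hidden size, and toy "gesture" clusters below
# are illustrative assumptions, not values from the paper.

rng = np.random.default_rng(0)

def elm_train(X, Y, n_hidden=64):
    """Randomly project inputs, then solve output weights by least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # fixed random input weights
    b = rng.normal(size=n_hidden)                 # fixed random biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                  # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Classify by the largest output-layer response."""
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

# Toy example: two well-separated "gesture" clusters in feature space.
X = np.vstack([rng.normal(0.0, 0.1, (20, 5)),
               rng.normal(1.0, 0.1, (20, 5))])
Y = np.zeros((40, 2))
Y[:20, 0] = 1.0   # one-hot labels for class 0
Y[20:, 1] = 1.0   # one-hot labels for class 1

W, b, beta = elm_train(X, Y)
pred = elm_predict(X, W, b, beta)
```

Because the random hidden layer is never tuned, training reduces to one linear solve, which is what makes ELM attractive for this kind of classification task.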
