Abstract

Gesture recognition is an important task in human-robot interaction (HRI). This paper describes a novel system for recognizing 3D dynamic gestures from the depth information provided by the Kinect sensor. The proposed system tracks the upper body and combines hidden Markov models (HMMs) with dynamic time warping (DTW) to avoid gesture misclassification. Using the skeleton algorithm provided by the Kinect SDK, the body is tracked and joint information is extracted. Each gesture is characterized by the joint angle that remains active while the gesture is executed. The variations of this angle over the course of the gesture are used as inputs to the HMMs in order to recognize the dynamic gestures. By feeding the HMM output to the DTW stage, we achieved good classification performance without any misclassification. Moreover, because the method relies only on depth information, it is robust to environmental conditions such as illumination changes and scene complexity.
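The paper itself does not publish code, but the pipeline outlined in the abstract (extracting an active joint angle from the tracked skeleton, scoring the angle sequence with per-gesture HMMs, and confirming the best candidate with DTW) can be sketched roughly as follows. This is a minimal illustration only: the joint triplet, the hmmlearn-style `score` interface, the reference templates, and the DTW acceptance threshold are all assumptions made for this example, not details taken from the paper.

```python
# Illustrative sketch, not the authors' implementation.
# Assumptions: angles come from shoulder-elbow-wrist joints, per-gesture HMMs
# expose an hmmlearn-style score() method, and a fixed DTW threshold decides
# whether the HMM's best candidate is accepted.
import numpy as np


def joint_angle(a, b, c):
    """Angle (radians) at joint b formed by 3D points a, b, c."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    return np.arccos(np.clip(cos, -1.0, 1.0))


def dtw_distance(x, y):
    """Classic dynamic time warping distance between two 1-D angle sequences."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]


def classify(angle_seq, hmm_models, templates, dtw_threshold=15.0):
    """Pick the HMM with the highest log-likelihood, then accept its gesture
    only if the DTW distance to that gesture's reference template is small
    enough (an assumed combination rule)."""
    seq = np.asarray(angle_seq, dtype=float)
    best = max(hmm_models, key=lambda g: hmm_models[g].score(seq.reshape(-1, 1)))
    if dtw_distance(seq, templates[best]) <= dtw_threshold:
        return best
    return None  # reject: the HMM candidate is not confirmed by DTW
```

Rejecting a candidate whose DTW distance to its reference template is too large is one plausible way to realize the HMM-plus-DTW combination the abstract describes; the actual fusion rule used by the authors may differ.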
