Abstract

In this paper, we present image-based methods for the robust, real-time recognition of static and dynamic hand gestures. These methods enable intuitive interaction with an assistance system, in which skin tones are used to segment the hands. This segmentation forms the basis of feature extraction for both static and dynamic gestures. For static gestures, the activation of a particular region triggers an associated action, whereas dynamic gestures are recognized with an HMM classifier based on the motion flow. The assistance system supports workers in manual tasks in the context of assembling complex products. This paper focuses on the interaction of the user with the system and describes work in progress, with initial results from an application scenario.
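As an illustration of the skin-tone segmentation step described above, the following is a minimal sketch in Python using OpenCV, not the authors' implementation; the HSV threshold values and the morphological post-processing are assumptions that would need calibration to the actual camera and lighting setup.

    import cv2
    import numpy as np

    def segment_skin(frame_bgr):
        """Return a binary mask of likely skin pixels (illustrative HSV thresholds)."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        # Assumed skin-tone range in HSV; real systems calibrate these per setup.
        lower = np.array([0, 40, 60], dtype=np.uint8)
        upper = np.array([25, 180, 255], dtype=np.uint8)
        mask = cv2.inRange(hsv, lower, upper)
        # Morphological opening/closing to suppress noise and fill small holes.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
        return mask

The resulting mask would serve as the basis for feature extraction: for static gestures, checking whether the segmented hand overlaps a predefined activation region; for dynamic gestures, feeding the tracked hand trajectory to the HMM classifier.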
