Abstract

We present an adaptive skin color model for hand gesture tracking, applied to cross-modal analysis of planning meetings. For each meeting participant, we build a Gaussian skin color model and a corresponding skin color filter. Combined with the vector coherence mapping (VCM) algorithm, the model allows us to track hand motion and obtain 3D trajectories, from which the hand gesture stream is extracted. Separate skin color models are created for different people to account for individual differences in skin color, and each model is updated dynamically to adapt to changes in the environment. A parallel system has been implemented to track and extract hand motion trajectories, and examples of hand gesture tracking in meeting environments are provided. Applying the adaptive skin color model can increase the speed of hand tracking.
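To illustrate the kind of per-participant model the abstract describes, the sketch below implements a single-Gaussian skin color classifier with a dynamic update step. The choice of color space (YCbCr chrominance), the Mahalanobis-distance threshold, and the adaptation rate are assumptions for illustration only and are not taken from the paper.

```python
import numpy as np

class GaussianSkinModel:
    """Per-participant skin color model: one Gaussian over (Cb, Cr) chrominance.

    Color space, threshold, and update rate are illustrative assumptions,
    not values from the paper.
    """

    def __init__(self, skin_samples, alpha=0.05, threshold=4.0):
        # skin_samples: (N, 2) array of (Cb, Cr) values from labeled skin pixels
        self.mean = skin_samples.mean(axis=0)
        self.cov = np.cov(skin_samples, rowvar=False) + 1e-6 * np.eye(2)
        self.alpha = alpha          # adaptation rate for dynamic updates
        self.threshold = threshold  # Mahalanobis-distance cutoff

    def filter(self, pixels):
        """Return a boolean mask marking which (Cb, Cr) pixels look like skin."""
        diff = pixels - self.mean
        inv_cov = np.linalg.inv(self.cov)
        # Squared Mahalanobis distance of each pixel to the model mean
        d2 = np.einsum('ij,jk,ik->i', diff, inv_cov, diff)
        return d2 < self.threshold ** 2

    def update(self, skin_pixels):
        """Blend in statistics of newly classified skin pixels to follow lighting changes."""
        if len(skin_pixels) < 2:
            return
        new_mean = skin_pixels.mean(axis=0)
        new_cov = np.cov(skin_pixels, rowvar=False) + 1e-6 * np.eye(2)
        self.mean = (1 - self.alpha) * self.mean + self.alpha * new_mean
        self.cov = (1 - self.alpha) * self.cov + self.alpha * new_cov
```

In a pipeline following the abstract, one such model would be instantiated per participant; the pixels passing `filter` would then feed the VCM-based motion tracking that produces the 3D hand trajectories.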
