Abstract

Tracking the articulated human body is a difficult research problem because body poses change dynamically and vary widely in visual appearance. Pictorial Structures (PS) combined with dynamic programming or particle filtering has been widely used to track the human body, which is highly articulated and moves dynamically. In this paper, we use PS and a particle filter for upper-body tracking. However, a Markov-process-based dynamic motion model alone cannot adequately predict the particles. We propose a key-pose-based proposal distribution that uses similarities between the input silhouette image and the key poses to predict the particles effectively. We select relatively few example poses from the pose space as key poses, train embedded features, and formulate the proposal distribution from the key-pose similarities together with a Markov-process-based dynamic model. We experimentally evaluate the proposed proposal distribution and observation model, and test gesture recognition for human-robot interaction.
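To make the mixture proposal concrete, the sketch below shows one plausible way to draw particles either from a Markov-process dynamic model or from Gaussians centred on key poses selected by silhouette similarity. This is a minimal illustration only, not the authors' implementation; all names and constants (`key_poses`, `silhouette_similarity`, `MOTION_NOISE`, `ALPHA`, the pose dimensionality) are assumptions for the example.

```python
# Sketch of a key-pose-augmented proposal distribution for particle filtering.
# Assumptions: poses are D-dimensional vectors, key poses were selected offline,
# and silhouette similarity is a placeholder scoring function.
import numpy as np

rng = np.random.default_rng(0)

D = 10                 # dimensionality of the upper-body pose vector (assumed)
N = 200                # number of particles
MOTION_NOISE = 0.05    # std of the Markov (random-walk) dynamic model
KEY_NOISE = 0.10       # std around a chosen key pose
ALPHA = 0.5            # mixing weight: dynamic model vs. key-pose proposal

key_poses = rng.normal(size=(8, D))   # placeholder key poses from the pose space

def silhouette_similarity(observation, key_poses):
    """Placeholder similarity between the input silhouette feature and each key pose."""
    scores = np.exp(-np.linalg.norm(key_poses - observation, axis=1))
    return scores / scores.sum()

def propose(particles, observation):
    """Draw new particles from the mixture proposal."""
    sims = silhouette_similarity(observation, key_poses)
    new_particles = np.empty_like(particles)
    for i, p in enumerate(particles):
        if rng.random() < ALPHA:
            # Markov-process dynamic model: diffuse the previous particle.
            new_particles[i] = p + rng.normal(scale=MOTION_NOISE, size=D)
        else:
            # Key-pose proposal: sample near a key pose chosen by similarity.
            k = rng.choice(len(key_poses), p=sims)
            new_particles[i] = key_poses[k] + rng.normal(scale=KEY_NOISE, size=D)
    return new_particles

particles = rng.normal(size=(N, D))    # previous particle set
observation = rng.normal(size=D)       # embedded silhouette feature (assumed)
particles = propose(particles, observation)
```

In practice the proposed particles would then be weighted by the observation model before resampling; the mixing weight controls how strongly the tracker relies on key-pose guidance versus temporal continuity.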
