Abstract

In this paper, we propose a new approach, appearance-guided particle filtering (AGPF), for high-degree-of-freedom visual tracking from an image sequence. The method exploits a set of known attractors in the state space and integrates both appearance and motion-transition information for visual tracking. A probability propagation model based on these two types of information is derived from a Bayesian formulation, and a particle filtering framework is developed to realize it. Experimental results demonstrate that the proposed method is effective for high-degree-of-freedom visual tracking problems, such as articulated hand tracking and lip-contour tracking.
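As a rough illustration of the propagation scheme described above, the sketch below shows one step of a generic sequential importance resampling particle filter in which proposals mix a motion-transition model with samples drawn near known attractor states, and weights come from an appearance likelihood. The function names (`motion_model`, `appearance_likelihood`), the attractor mixing probability, and the noise scale are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def agpf_step(particles, weights, observation,
              motion_model, appearance_likelihood,
              attractors=None, attractor_prob=0.2, rng=None):
    """One propagation step of an appearance-guided particle filter (sketch).

    With probability `attractor_prob`, a particle is re-seeded near a known
    attractor state; otherwise it follows the motion-transition model.
    Weights are then set by the appearance likelihood of the new observation.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, dim = particles.shape

    # Systematic resampling according to the current weights.
    positions = (rng.random() + np.arange(n)) / n
    indices = np.searchsorted(np.cumsum(weights), positions)
    particles = particles[indices]

    # Propagate: mix motion-transition proposals with attractor-guided ones.
    new_particles = np.empty_like(particles)
    for i in range(n):
        if attractors is not None and rng.random() < attractor_prob:
            a = attractors[rng.integers(len(attractors))]
            new_particles[i] = a + rng.normal(0.0, 0.05, size=dim)
        else:
            new_particles[i] = motion_model(particles[i], rng)

    # Weight by appearance likelihood of the observation and normalize.
    w = np.array([appearance_likelihood(p, observation) for p in new_particles])
    w = w / (w.sum() + 1e-12)
    return new_particles, w
```

For articulated hand tracking, for example, the state vector might collect joint angles, `attractors` could be a few canonical hand poses, and `appearance_likelihood` could compare a rendered hand silhouette against the observed image; these choices are placeholders for whatever models the application supplies.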
