Abstract

Tracking a moving human from monocular vision is a useful capability for the coming generation of human-machine interfaces. It is a challenging planning and control problem in dynamic environments, and methods for moving-human tracking that assume a fixed platform or a fixed background are not applicable. In this paper, we propose a visual tracking approach that detects an unexpected moving human appearing in the scene of a monocular camera. The method is implemented as a novel combination of multiple features in a tracking algorithm based on the particle filter, which allows robust and accurate visual tracking under real-time constraints. We apply a particle filter to human tracking with a multi-feature observation model that exploits skin color and the head-and-shoulder boundary as its prior density. The relevance of our approach to the human tracking problem is then investigated experimentally: accuracy and robustness were evaluated and compared on a challenging synthetic tracking problem and in real visual tracking experiments. The results show that a human's motion can be tracked robustly in a complex environment with a moving camera.
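
To make the idea of particle-filter tracking with a fused multi-feature likelihood concrete, the following is a minimal sketch in the spirit of the approach described above. The state layout (image position only), the crude RGB skin rule, the gradient-based stand-in for the head-and-shoulder edge score, and all parameter values are assumptions made for illustration; they are not the paper's implementation.

```python
# Minimal particle-filter tracker with a two-feature observation model
# (skin-colour score + edge-energy score). Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)
BOX = 20  # half-size (pixels) of the candidate head region (assumed)


def crop(frame, x, y):
    """Clip a (2*BOX) x (2*BOX) candidate region around (x, y)."""
    h, w = frame.shape[:2]
    x0, x1 = int(max(x - BOX, 0)), int(min(x + BOX, w))
    y0, y1 = int(max(y - BOX, 0)), int(min(y + BOX, h))
    return frame[y0:y1, x0:x1]


def skin_score(region):
    """Fraction of pixels passing a crude RGB skin test (assumed model)."""
    if region.size == 0:
        return 0.0
    r = region[..., 0].astype(int)
    g = region[..., 1].astype(int)
    b = region[..., 2].astype(int)
    skin = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (r - g > 15)
    return float(skin.mean())


def shape_score(region):
    """Mean gradient magnitude as a stand-in for head-and-shoulder edge support."""
    if region.shape[0] < 2 or region.shape[1] < 2:
        return 0.0
    gray = region.mean(axis=2)
    gy, gx = np.gradient(gray)
    return float(np.hypot(gx, gy).mean()) / 255.0


def track(frames, init_xy, n_particles=200, motion_std=5.0, alpha=0.5):
    """One filtering pass: predict, weight with the fused likelihood,
    estimate the posterior mean, and resample."""
    particles = np.tile(np.asarray(init_xy, float), (n_particles, 1))
    estimates = []
    for frame in frames:
        particles += rng.normal(0.0, motion_std, particles.shape)   # predict (random walk)
        scores = np.array([
            alpha * skin_score(crop(frame, x, y))
            + (1.0 - alpha) * shape_score(crop(frame, x, y))
            for x, y in particles
        ])
        weights = scores + 1e-12                                     # avoid all-zero weights
        weights /= weights.sum()
        estimates.append(weights @ particles)                        # posterior mean state
        idx = rng.choice(n_particles, n_particles, p=weights)        # resample
        particles = particles[idx]
    return estimates
```

For a list of RGB frames given as `numpy` arrays, a call such as `track(frames, (160, 120))` would return one estimated head position per frame, starting from the assumed initial location. Fusing two weak cues in the weights is what lets the filter tolerate failures of either cue alone, which is the rationale for the multi-feature observation model.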
