Abstract

In tracking, one of the major challenges is handling appearance variations caused by changes in scale, pose, illumination, and occlusion. In this paper, we propose a novel Bayesian Hierarchical Appearance Model (BHAM) for robust object tracking. Our idea is to model the appearance of a target as a combination of multiple appearance models, each covering the target's appearance changes under a given viewing angle. Specifically, target instances are modeled by a Dirichlet process and dynamically clustered based on their visual similarity. BHAM thus provides an infinite nonparametric mixture of distributions that grows automatically with the complexity of the appearance data. We build an object tracking system by integrating BHAM with background subtraction and the KLT tracker. Experimental results on real-world videos show that our system outperforms several state-of-the-art trackers.
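
The abstract does not describe the inference procedure, so the following is only a rough sketch of the general idea behind a Dirichlet process mixture whose number of clusters grows with the data: appearance instances (here hypothetical 2-D feature vectors) are assigned sequentially under a Chinese-restaurant-process prior with isotropic Gaussian likelihoods. The function crp_assign and the parameters alpha, sigma, and sigma0 are illustrative assumptions, not part of BHAM.

```python
import numpy as np

def crp_assign(features, alpha=1.0, sigma=1.0, sigma0=5.0, rng=None):
    """Sequentially assign appearance feature vectors to clusters using a
    Chinese-restaurant-process prior with isotropic Gaussian likelihoods.
    Returns one cluster label per feature vector."""
    rng = np.random.default_rng(rng)
    means, counts, labels = [], [], []
    for x in features:
        log_probs = []
        # Existing clusters: weighted by size, scored by distance to the cluster mean.
        for mean, count in zip(means, counts):
            ll = -np.sum((x - mean) ** 2) / (2 * sigma ** 2)
            log_probs.append(np.log(count) + ll)
        # New cluster: weighted by alpha, scored under a broad base distribution.
        log_probs.append(np.log(alpha) - np.sum(x ** 2) / (2 * sigma0 ** 2))
        log_probs = np.array(log_probs)
        probs = np.exp(log_probs - log_probs.max())
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(means):          # open a new appearance cluster
            means.append(x.astype(float))
            counts.append(1)
        else:                        # update the chosen cluster's running mean
            counts[k] += 1
            means[k] += (x - means[k]) / counts[k]
        labels.append(int(k))
    return labels

# Example: 2-D "appearance features" drawn from two well-separated modes.
if __name__ == "__main__":
    gen = np.random.default_rng(0)
    data = np.vstack([gen.normal([0, 0], 0.5, (50, 2)),
                      gen.normal([6, 6], 0.5, (50, 2))])
    print(set(crp_assign(data, alpha=1.0, rng=0)))  # typically two clusters
```

In this toy example the two separated modes usually end up in two clusters, mirroring how a nonparametric mixture can add a new appearance component when the target is observed from a sufficiently different viewing angle.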
