Abstract

Visual tracking is a fundamental component of high-level video understanding problems such as motion analysis, event detection and action recognition. Recently, Discriminative Correlation Filters (DCF) have achieved enormous popularity in the tracking community due to their high computational efficiency and good robustness. However, the underlying boundary effect of the DCF leads to a very restricted target search region at the detection step. A larger search area is generally adopted to overcome this disadvantage, but such an expansion usually includes a substantial amount of background information, which contaminates the tracking model in realistic tracking scenarios. To alleviate this major drawback, we propose a generic DCF tracking framework which suppresses background information and highlights the foreground object with an object likelihood map computed from color histograms. This object likelihood map is merged with the cosine window and then integrated into the DCF formulation. The DCF is therefore less burdened in the training step, focusing on pixels with higher object likelihood. Extensive experiments on the OTB50 and OTB100 benchmarks demonstrate that our adaptively windowed tracking framework can be combined with many DCF trackers and achieves significant performance improvements.
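
Below is a minimal illustrative sketch of the idea described above: a per-pixel object likelihood map is estimated from foreground and background color histograms and merged with the standard cosine (Hann) window used by DCF trackers. The histogram bin count, the Bayes-rule likelihood with equal priors, and the element-wise product as the merging rule are all assumptions for illustration; the paper's exact formulation is not reproduced here.

```python
import numpy as np

def object_likelihood_window(patch, fg_mask, n_bins=16):
    """Sketch: combine a color-histogram object likelihood map with a cosine window.

    patch   : (H, W, 3) uint8 image patch around the target.
    fg_mask : (H, W) boolean mask of the assumed foreground (target) region.
    """
    h, w, _ = patch.shape

    # Quantize RGB into a joint color index per pixel.
    idx = (patch // (256 // n_bins)).astype(int)
    flat = idx[..., 0] * n_bins * n_bins + idx[..., 1] * n_bins + idx[..., 2]

    # Foreground / background color histograms (small prior avoids zero bins).
    fg_hist = np.bincount(flat[fg_mask].ravel(), minlength=n_bins ** 3) + 1.0
    bg_hist = np.bincount(flat[~fg_mask].ravel(), minlength=n_bins ** 3) + 1.0
    fg_hist /= fg_hist.sum()
    bg_hist /= bg_hist.sum()

    # Per-pixel object likelihood P(object | color), assuming equal priors.
    likelihood = fg_hist[flat] / (fg_hist[flat] + bg_hist[flat])

    # Standard cosine (Hann) window used in DCF training.
    cos_win = np.outer(np.hanning(h), np.hanning(w))

    # Assumed merge: element-wise product, emphasising likely object pixels.
    return cos_win * likelihood
```

In a DCF tracker, the returned adaptive window would replace the plain cosine window when weighting training samples, so that background pixels inside the enlarged search region contribute less to the learned filter.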
