Abstract

As one of the core components of intelligent monitoring, target tracking is the basis for video content analysis and processing. In visual tracking, occlusion, illumination changes, and pose and scale variation cause large appearance changes of the target object and the background over time, and handling these changes remains the main challenge for robust target tracking. In this paper, we present a new robust algorithm (STC-KF) based on spatio-temporal context and Kalman filtering. Our approach introduces a novel formulation of the context information that exploits the entire local region around the target rather than only sparse key points, so that important context information related to the target is not lost. The state of the object during tracking is determined from the Euclidean distance between the image intensities of the target region in two consecutive frames. When occlusion is detected, the Kalman filter prediction is fed back as the observation, and the updated estimate gives the object position marked in the next frame. The performance of the proposed STC-KF algorithm is evaluated and compared with the original STC algorithm. Experimental results on benchmark sequences show that the proposed method outperforms the original STC algorithm under heavy occlusion and large appearance changes.
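The occlusion test and the Kalman feedback step described above can be pictured with a short sketch. The following is a minimal illustration, not the authors' implementation: the constant-velocity Kalman model, the `intensity_distance` helper, the `stc_kf_step` driver, and the threshold value are all illustrative assumptions, and the STC confidence-map tracker itself is assumed to be supplied elsewhere.

```python
import numpy as np

class ConstantVelocityKF:
    """2-D constant-velocity Kalman filter over state [x, y, vx, vy].
    Illustrative model, not the paper's exact parameterization."""
    def __init__(self, x0, y0, q=1e-2, r=1.0):
        self.x = np.array([x0, y0, 0.0, 0.0])
        self.P = np.eye(4)
        self.F = np.array([[1, 0, 1, 0],
                           [0, 1, 0, 1],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = q * np.eye(4)
        self.R = r * np.eye(2)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def correct(self, z):
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]

def intensity_distance(prev_patch, curr_patch):
    """Euclidean distance between the image intensities of the target
    region in two consecutive frames (used here as an occlusion cue)."""
    return np.linalg.norm(curr_patch.astype(float) - prev_patch.astype(float))

def stc_kf_step(kf, stc_position, prev_patch, curr_patch, threshold=1e3):
    """One tracking step: trust the STC estimate while the appearance change
    is small; fall back to the Kalman prediction when occlusion is suspected.
    The threshold is a placeholder, not a value taken from the paper."""
    predicted = kf.predict()
    if intensity_distance(prev_patch, curr_patch) < threshold:
        # Normal case: use the STC result as the Kalman observation.
        return kf.correct(stc_position)
    # Occlusion suspected: feed the prediction back as the observation,
    # so the filter coasts on its motion model.
    return kf.correct(predicted)
```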

Highlights

  • While target tracking is one of the most noteworthy and active research areas in computer vision and machine learning, many challenges remain unresolved [1]. Researchers have proposed many different tracking algorithms to handle occlusion, illumination changes, and pose variation during target tracking

  • In order to enhance the stability of the Kalman filter algorithm in the target tracking process, Pouya et al. [13] proposed a tracking algorithm based on Kanade–Lucas–Tomasi (KLT) and Kalman filtering, using KLT to track targets and the Kalman filter to estimate the target state

  • We evaluate the proposed spatio-temporal context Kalman filter (STC-KF) tracking algorithm using three representative benchmark sequences from OTB50/100


Summary

Introduction

While target tracking is one of the most noteworthy and active research areas in computer vision and machine learning, many challenges remain unresolved [1]. Researchers have proposed many different tracking algorithms to handle occlusion, illumination changes, and pose variation during target tracking. Most of these algorithms adopt template matching [2,3], small facet tracking [4,5], particle filtering [6,7], sparse representation [8,9], contour modeling [10], or image segmentation [11]. In order to enhance the stability of the Kalman filter algorithm in the target tracking process, Pouya et al. [13] proposed a tracking algorithm based on Kanade–Lucas–Tomasi (KLT) and Kalman filtering, using KLT to track targets and the Kalman filter to estimate the target state.
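As a rough illustration of the KLT-plus-Kalman scheme attributed to [13] (not the cited authors' implementation), the sketch below tracks feature points with OpenCV's pyramidal Lucas–Kanade optical flow and feeds their centroid to a constant-velocity Kalman filter. The function names `make_kalman` and `track` and all parameter values are illustrative assumptions.

```python
import cv2
import numpy as np

def make_kalman():
    # State: [x, y, vx, vy]; measurement: [x, y]. Noise levels are placeholders.
    kf = cv2.KalmanFilter(4, 2)
    kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], np.float32)
    kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], np.float32)
    kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)
    kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)
    return kf

def track(video_path):
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
                                  qualityLevel=0.3, minDistance=7)
    kf = make_kalman()
    while True:
        ok, frame = cap.read()
        if not ok or pts is None:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # KLT: pyramidal Lucas-Kanade optical flow on the tracked points.
        next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        good = next_pts[status.flatten() == 1]
        if len(good) > 0:
            cx, cy = good.reshape(-1, 2).mean(axis=0)
            kf.predict()
            # The centroid of the KLT points serves as the Kalman measurement.
            est = kf.correct(np.array([[cx], [cy]], np.float32))
        else:
            est = kf.predict()  # Points lost: coast on the motion model.
        prev_gray = gray
        pts = good.reshape(-1, 1, 2) if len(good) else None
        print("estimated centre:", est[:2].ravel())
    cap.release()
```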

