Abstract

Object tracking is a challenging research task because of drastic appearance changes of the target and a lack of training samples. Most online learning trackers are hampered by complications such as drift under occlusion, the target moving out of view, or fast motion. In this paper, a real-time object tracking algorithm termed "robust sum of template and pixel-wise learners" (rStaple) is proposed to address these problems. It combines multi-feature correlation filters with a color histogram. Firstly, we extract a combination of specific features from the search area around the target and then merge feature channels to train a translation correlation filter online. Secondly, the target state is determined by a discriminating mechanism, wherein the model update procedure stops when the target is occluded or out of view and is re-activated when the target re-appears. In addition, by calculating the color histogram score in the search area, a significant enhancement is applied to the score map. The target position can then be estimated by combining the enhanced color histogram score with the correlation filter response map. Finally, a scale filter is trained for multi-scale detection to obtain the final tracking result. Extensive experimental results on a large benchmark dataset demonstrate that the proposed rStaple is superior to several state-of-the-art algorithms in terms of accuracy and efficiency.
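The final localization step described above — merging the per-pixel color-histogram score with the correlation-filter response and taking the peak — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the map sizes, the normalization to [0, 1], and the merge weight `alpha` are all assumptions.

```python
import numpy as np

def fuse_scores(cf_response, hist_score, alpha=0.3):
    """Linearly merge a correlation-filter response map with a
    per-pixel color-histogram score map (both HxW, assumed already
    normalized to [0, 1]). `alpha` is a hypothetical merge weight."""
    return (1.0 - alpha) * cf_response + alpha * hist_score

def locate_target(score_map):
    """Estimate the target position as the (row, col) of the
    maximum of the fused score map."""
    return np.unravel_index(np.argmax(score_map), score_map.shape)

# Toy example: both learners peak at the same pixel (2, 3).
cf = np.zeros((5, 5)); cf[2, 3] = 1.0     # template (correlation filter) peak
hist = np.zeros((5, 5)); hist[2, 3] = 0.8 # histogram score agrees
fused = fuse_scores(cf, hist)
print(locate_target(fused))               # (2, 3)
```

The linear merge mirrors the "sum of template and pixel-wise learners" idea from the original Staple: the template learner is sharp but deformation-sensitive, while the histogram learner is deformation-robust but spatially coarse, so a weighted sum balances the two.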

Highlights

  • Object tracking has been widely used in the field of computer vision [1], such as in automatic driving, precision guidance, and video surveillance

  • A real-time tracker termed robust sum of template and pixel-wise learners (rStaple) is proposed (Appl. Sci. 2020, 10, 3021). It simultaneously improves tracking accuracy while maintaining real-time performance, building on Staple's combination of correlation filters and a color histogram, which is inherently robust to both color changes and deformations

  • Compared to the baseline Staple, the distance precision rate increases by 6.6%, the overlap success rate increases by 4.9%, and the average center location error decreases from 32 to 18.9 pixels


Summary

Introduction

Object tracking has been widely used in the field of computer vision [1], such as in automatic driving, precision guidance, and video surveillance. In this paper, a real-time tracker termed robust Staple (rStaple) is proposed. It can simultaneously improve tracking accuracy while maintaining real-time performance, building on Staple's combination of correlation filters and a color histogram, which is inherently robust to both color changes and deformations. Staple updates the learned filters every frame to handle target appearance variations over time, but such a scheme tends to cause model drift under occlusion or when the target moves out of view. We adopt a detection mechanism that remains effective under severe temporary occlusion or when the target is missing from the current frame; in such cases, the correlation filters terminate the model update.
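The update-gating idea above can be sketched in a few lines. The summary does not state the paper's exact occlusion criterion, so this sketch uses a hypothetical confidence test on the peak of the correlation response (threshold `tau` and learning rate `lr` are assumed values, not the paper's):

```python
import numpy as np

def is_confident(response, tau=0.25):
    """Hypothetical occlusion test: the detection is judged reliable
    only when the peak of the correlation response exceeds tau; a low
    peak typically signals occlusion or the target leaving the view."""
    return float(response.max()) >= tau

def update_model(model, new_model, response, lr=0.02, tau=0.25):
    """Standard linear-interpolation model update, applied only for
    confident detections; otherwise the model is frozen so that an
    occluder is not learned into the filter."""
    if not is_confident(response, tau):
        return model                          # occluded: skip the update
    return (1.0 - lr) * model + lr * new_model

# Toy example: a weak response map freezes the model.
model = np.ones(4)
weak = np.full((3, 3), 0.1)                   # peak 0.1 < tau
print(np.allclose(update_model(model, np.zeros(4), weak), model))  # True
```

When the target re-appears and the response peak rises above the threshold again, the same call resumes the interpolation update, which matches the "re-activated when the target re-appears" behavior described in the abstract.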

Related Work
Kernelized Correlation Filter
Color Histogram
Proposed rStaple Tracking Method
Feature Fusion
Histogram Significant Enhancement
Scale Estimation
Adaptive Model Update
Implementation Details
Experimental Results
Evaluation Metrics
Overall Performance
Precision plots and success plots of OPE
Specific Analysis
Conclusions