Abstract

Object tracking is an important problem in many practical computer vision applications, such as video surveillance, self-driving, and social scene understanding. Although traditional correlation filters have achieved strong performance in tracking accuracy and speed in specific scenarios, they still suffer from several defects: weak robustness caused by reliance on a single feature, boundary effects caused by circular shifts, and model corruption introduced by model updates. To address these problems, this paper proposes a visual tracking algorithm based on confidence-driven multi-feature correlation filtering. It adaptively selects histogram of oriented gradients (HOG) features or fused features according to a confidence measure, improving both the robustness and the speed of target tracking. Firstly, a confidence level is proposed to evaluate the reliability of the HOG feature based on its response map. Secondly, a selective multi-feature fusion method is proposed to improve the robustness of the tracking algorithm. Thirdly, a novel model-updating mechanism, called the model rollback mechanism, is proposed to reduce the impact of model corruption. The algorithm is evaluated on public datasets and compared with several state-of-the-art algorithms. Experimental results show that the proposed algorithm effectively improves tracking accuracy under the above conditions and outperforms the state-of-the-art tracking algorithms.
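The abstract does not specify how the confidence of the HOG response map is computed. A common proxy in correlation-filter tracking is the average peak-to-correlation energy (APCE) of the response map; the sketch below uses it to illustrate the adaptive feature-selection idea (the function names, the APCE choice, and the threshold value are assumptions, not the paper's actual formulation):

```python
import numpy as np

def response_confidence(response):
    """Confidence of a correlation response map via APCE.

    APCE = (max - min)^2 / mean((response - min)^2); a sharp, isolated
    peak yields a high value, while a flat or noisy map yields a low one.
    This is an illustrative stand-in for the paper's confidence measure.
    """
    peak, low = response.max(), response.min()
    return (peak - low) ** 2 / np.mean((response - low) ** 2)

def select_features(hog_response, threshold=20.0):
    """Use HOG features alone when the HOG response is reliable;
    otherwise fall back to the fused features. Threshold is illustrative."""
    if response_confidence(hog_response) >= threshold:
        return "hog"     # response is trustworthy: cheaper single feature
    return "fused"       # response is ambiguous: use multi-feature fusion
```

A sharply peaked response map (confident localization) keeps the fast HOG-only path, while a diffuse map triggers the more robust fusion path, which matches the speed/robustness trade-off described above.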
