Abstract

Recently, Discriminative Correlation Filters (DCF) have shown excellent performance in visual object tracking. The correlation for computing a response map can be conducted efficiently in the Fourier domain by the Discrete Fourier Transform (DFT) of the inputs, where the DFT of an image exhibits symmetry in the Fourier domain. To enhance the robustness and discriminative ability of the filters, many efforts have been devoted to optimizing the learning process. Regularization methods, such as spatial or temporal regularization, used in existing DCF trackers aim to enhance the capacity of the filters. However, most existing methods still fail to deal with severe appearance variations, in particular large scale and aspect ratio changes. In this paper, we propose a novel framework that employs adaptive spatial regularization and temporal regularization to learn reliable filters in both the spatial and temporal domains for tracking. To alleviate the influence of the background and distractors on non-rigid target objects, two sub-models are combined and multiple features are utilized to learn robust correlation filters. In addition, most DCF trackers that apply a 1-dimensional scale-space search suffer from appearance changes such as non-rigid deformation. We therefore propose a 2-dimensional scale-space search method to find appropriate scales and adapt to large scale and aspect ratio changes. We perform comprehensive experiments on four benchmarks: OTB-100, VOT-2016, VOT-2018, and LaSOT. The experimental results illustrate the effectiveness of our tracker, which achieves competitive tracking performance. On OTB-100, our tracker achieves a gain of 0.8% in success compared to the best existing DCF trackers. On VOT-2018, our tracker outperforms the top DCF trackers with a gain of 1.1% in Expected Average Overlap (EAO). On LaSOT, we obtain a gain of 5.2% in success compared to the best DCF trackers.
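As a rough illustration of the 2-dimensional scale-space search mentioned above, the sketch below (Python/NumPy) enumerates candidate width and height scale factors independently and keeps the pair whose correlation response is largest, so aspect-ratio changes can be followed as well as overall scale changes. The helper names `extract_patch` and `filter_response` are hypothetical placeholders for the feature-extraction and DCF-response steps; this is a minimal sketch of the idea, not the authors' implementation.

```python
import itertools
import numpy as np

def scale_search_2d(frame, center, base_size, filter_response, extract_patch,
                    factors=(0.96, 0.98, 1.0, 1.02, 1.04)):
    """Search scale factors for width and height independently (2-D search),
    instead of a single factor shared by both axes (1-D search)."""
    best = (-np.inf, 1.0, 1.0)
    for sx, sy in itertools.product(factors, factors):
        size = (base_size[0] * sx, base_size[1] * sy)   # candidate (width, height)
        patch = extract_patch(frame, center, size)      # resample to template size
        score = filter_response(patch).max()            # peak of correlation response
        if score > best[0]:
            best = (score, sx, sy)
    _, sx, sy = best
    return sx, sy                                       # chosen scale pair
```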

Highlights

  • An object tracking algorithm aims to track the object’s position in a 2D or 3D input, such as a wireless signal, radar, or camera (e.g., a video frame)

  • The Bluetooth 5.1 Direction Finding standard provides the possibility of high-precision, real-time tracking of targets based on the Angle of Departure (AoD) and the Angle of Arrival (AoA) [1]

  • The Discriminative Correlation Filter (DCF) for tracking has attracted a lot of attention due to its efficiency and effectiveness, from the trackers based on handcrafted features [2,3,4] to the trackers that exploit learning with deep features [5,6,7]

Introduction

An object tracking algorithm aims to track the object’s position in a 2D or 3D input, such as a wireless signal, radar (i.e., a radar echo), or camera (e.g., a video frame). Visual object tracking, the main topic of this paper, is an important area in computer vision that estimates the trajectory of a target object using visual information from a video sequence. Visual object tracking can be applied to many applications, such as video surveillance, motion analysis, human–computer interaction, automatic robot navigation, and traffic monitoring. The Discriminative Correlation Filter (DCF) for tracking has attracted a lot of attention due to its efficiency and effectiveness, from trackers based on handcrafted features [2,3,4] to trackers that exploit learning with deep features [5,6,7]. The core idea in DCF trackers is to learn robust and discriminative filters that adapt to appearance changes online by minimizing a least-squares loss over all circular shifts of a training sample.
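To make the core idea concrete, the sketch below (Python/NumPy) shows the classical single-channel formulation in the MOSSE style: minimizing the least-squares loss over all circular shifts of a training patch has a closed-form solution in the Fourier domain, and the response map for a new patch is obtained by element-wise multiplication of DFTs. This is a minimal sketch of the general DCF principle under standard assumptions (single feature channel, Gaussian-shaped desired response), not the specific filter learned in this paper.

```python
import numpy as np

def train_dcf(patch, sigma=2.0, lam=1e-2):
    """Learn a single-channel correlation filter in the Fourier domain.

    The least-squares loss over all circular shifts of the training patch
    has the closed-form solution
        H = (G * conj(F)) / (F * conj(F) + lam),
    where F and G are the DFTs of the patch and of a Gaussian-shaped
    desired response, and lam is a regularization weight.
    """
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Desired response: a Gaussian peaked at the patch centre.
    g = np.exp(-((ys - h // 2) ** 2 + (xs - w // 2) ** 2) / (2 * sigma ** 2))
    F = np.fft.fft2(patch)
    G = np.fft.fft2(np.fft.ifftshift(g))  # shift the peak to the origin
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def response_map(H, patch):
    """Correlate the learned filter with a new patch via the DFT;
    the peak location gives the estimated target displacement."""
    return np.real(np.fft.ifft2(H * np.fft.fft2(patch)))
```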
