Abstract

In recent years, discriminative correlation filter (DCF)-based trackers have made considerable progress and drawn widespread attention in the unmanned aerial vehicle (UAV) tracking community. Most existing trackers collect historical information, e.g., training samples, previous filters, and response maps, to improve their discrimination and robustness. Under UAV-specific tracking challenges, e.g., fast motion and viewpoint change, variations of both the target and its environment in the new frame are unpredictable. Disturbed by the unknown future environment, trackers trained only on historical information may be confused by the new context, resulting in tracking failure. In this paper, we propose a novel future-aware correlation filter tracker, i.e., FACF. The proposed method aims to effectively exploit context information in the new frame for better discrimination and robustness, and consists of two stages: future state awareness and future context awareness. In the former stage, an effective time series forecast method is employed to infer a coarse position of the target, which serves as the reference for obtaining a context patch in the new frame. In the latter stage, we first obtain a single context patch with an efficient target-aware method, and then train a filter with this future context information in order to perform robust tracking. Extensive experimental results on three UAV benchmarks, i.e., UAV123_10fps, DTB70, and UAVTrack112, demonstrate the effectiveness and robustness of the proposed tracker. Our tracker performs comparably to other state-of-the-art trackers while running at ∼49 FPS on a single CPU.
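As a minimal illustration of the first stage, the sketch below forecasts a coarse target position with single exponential smoothing, the forecast method named in the summary below; the smoothing factor and the choice to smooth the center coordinates directly are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def forecast_coarse_position(centers, alpha=0.4):
    """Single exponential smoothing over past target centers.

    centers : sequence of (x, y) target centers from previous frames
    alpha   : smoothing factor in (0, 1]; the value 0.4 is an assumption
    Returns a coarse (x, y) estimate for the upcoming frame.
    """
    centers = np.asarray(centers, dtype=float)
    level = centers[0]
    for c in centers[1:]:
        # New level = alpha * latest observation + (1 - alpha) * previous level.
        level = alpha * c + (1 - alpha) * level
    # With single exponential smoothing, the one-step-ahead forecast equals
    # the last smoothed level.
    return level

# Example: a target drifting right and down over four frames.
print(forecast_coarse_position([(100, 80), (104, 82), (109, 85), (115, 88)]))
```

In the full tracker, this coarse position only serves as the reference for cropping the context patch; the final position is still decided by the correlation filter response in the detection step.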

Highlights

  • Visual object tracking is a popular but challenging task in the domain of multimedia and computer vision

  • With the popularity of unmanned aerial vehicles (UAVs), visual tracking applied for UAV platforms has attracted extensive attention, e.g., public security [1], disaster investigation [2], and remote sensor mounting [3]

  • With respect to the above concerns, we propose a two-stage correlation filter tracker that can efficiently exploit the contextual information of the upcoming frame

Summary

Introduction

Visual object tracking is a popular but challenging task in the domain of multimedia and computer vision. To cope with unknown future context, the proposed FACF tracker operates in two irreversible future-aware stages, i.e., future state awareness and future context awareness. The former stage predicts the spatial location change of the target in the upcoming frame, and the latter suppresses distractions caused by the complex future background while enhancing the filter's discriminative power. The main contributions of this work can be summarized as follows:

  • A coarse-to-fine DCF-based tracking framework is proposed to exploit the context information hidden in the frame that is about to be detected.

  • Single exponential smoothing forecast is used to provide a coarse position, which is the reference for acquiring a context patch.

  • A single future-aware context patch is obtained through an efficient target-aware mask generation method without additional feature extraction (a simplified sketch follows below).

  • Experimental results on three UAV benchmarks verify the effectiveness of the proposed tracker.

The remainder of this paper is organized as follows: Section 2 reviews the most relevant works; Section 3 introduces the baseline tracker; Section 4 details the proposed method; Section 5 presents extensive and comprehensive experiments; and Section 6 provides a brief summary of this work.
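To connect the two stages, the following sketch crops a future-aware context patch around the forecast center and builds a crude rectangular mask that suppresses the assumed target region; the padding factor and the rectangular mask are illustrative assumptions and do not reproduce the paper's efficient target-aware mask generation.

```python
import numpy as np

def crop_context_patch(frame, center, target_size, padding=2.0):
    """Crop a context patch around the coarse (forecast) center.

    frame       : H x W x C image as a NumPy array
    center      : (cx, cy) coarse target center from the forecast stage
    target_size : (w, h) of the target in the previous frame
    padding     : amount of surrounding context to keep (assumed value)
    """
    H, W = frame.shape[:2]
    tw, th = int(target_size[0]), int(target_size[1])
    pw, ph = int(tw * padding), int(th * padding)

    # Clamp the patch to the image borders.
    x0 = int(np.clip(center[0] - pw / 2, 0, max(W - pw, 0)))
    y0 = int(np.clip(center[1] - ph / 2, 0, max(H - ph, 0)))
    patch = frame[y0:y0 + ph, x0:x0 + pw]

    # Crude target-aware mask: zero out the assumed target rectangle so the
    # patch mainly contributes surrounding context when training the filter.
    mask = np.ones(patch.shape[:2], dtype=float)
    cx, cy = patch.shape[1] // 2, patch.shape[0] // 2
    mask[max(cy - th // 2, 0):cy + th // 2, max(cx - tw // 2, 0):cx + tw // 2] = 0.0
    return patch, mask

# Example: crop around a forecast center in a synthetic 480 x 640 frame.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
patch, mask = crop_context_patch(frame, center=(118, 90), target_size=(40, 30))
print(patch.shape, mask.shape)
```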

DCF-Based Trackers
Trackers with Context Learning
Trackers with Future Information
Trackers for UAVs
Revisit BACF
Problem Formulation
Stage One
Fast Context Acquisition
Filter Training
Object Detection
Tracking Procedure
Experiments
Parameters
Benchmarks
Metrics
Comparison with Handcrafted-Based Trackers
Comparison with Deep-Based Trackers
The Impact of the Key Parameter
The Validity of Components
The Strategy for Context Learning
Failure Cases
Findings
Conclusions