Abstract

Object tracking is challenged by varying target appearances and real-time requirements. Siamese regression trackers, one of the most popular tracking paradigms, excel in efficiency but lack the adaptability needed to cope with appearance variations. To improve their adaptability, the authors propose a new adaptive Siamese (ASiam) tracker, which integrates a novel adversarial template generation module and a motion-based failure recovery module. The template generation module exploits the temporal coherence and evolution of target appearance variations encoded in preceding tracklets and generates an adaptive target template online that approximates the varying target in the coming frame. This generation module is optimised via adversarial learning to achieve accurate appearance prediction and sharp template quality. The generated template, together with a search region, is fed into a Siamese tracking backbone to compute an appearance response map via dense similarity computation in a sliding-window manner. At frames where the Siamese tracking fails, the failure recovery module is invoked to perform deep frame-differencing motion detection and produce a motion response map. By fusing the two response maps, the drifted tracker can be re-calibrated. Extensive experiments on the OTB2013, OTB2015, and VOT2016 datasets demonstrate the accuracy and efficiency of the proposed tracker.
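The abstract describes three computational steps: sliding-window similarity between a template and a search region, frame-differencing motion detection, and fusion of the two response maps. The sketch below illustrates these steps only in outline; it is not the authors' implementation. The function names, the simple absolute-difference motion cue (the paper uses a learned "deep" frame-differencing module), and the fusion weight `alpha` and `threshold` values are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def appearance_response(template_feat, search_feat):
    # Dense sliding-window similarity (SiamFC-style cross-correlation):
    # the template feature map is correlated over the search-region
    # feature map to yield an appearance response map.
    # template_feat: (1, C, h, w), search_feat: (1, C, H, W)
    return F.conv2d(search_feat, template_feat)  # (1, 1, H-h+1, W-w+1)

def motion_response(prev_frame, curr_frame, threshold=0.1):
    # Plain frame differencing as a stand-in for the paper's deep
    # frame-differencing module: keep per-pixel changes above a threshold.
    diff = (curr_frame - prev_frame).abs().mean(dim=1, keepdim=True)
    return (diff > threshold).float() * diff

def fuse(app_map, mot_map, alpha=0.7):
    # Weighted fusion of appearance and motion cues to re-calibrate a
    # drifted tracker; alpha is an assumed hyperparameter, not from the paper.
    mot_map = F.interpolate(mot_map, size=app_map.shape[-2:],
                            mode='bilinear', align_corners=False)
    return alpha * app_map + (1 - alpha) * mot_map
```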
