Abstract

The performance of correlation filter (CF) based visual trackers has been greatly improved by pretrained deep convolutional neural networks. However, these networks limit the application scope of CF based trackers because of their high feature dimensionality, the high computational cost of feature extraction, and their large memory footprint. To alleviate this problem, we introduce a teacher-student knowledge distillation framework that produces a lightweight network to speed up CF based trackers. Specifically, we take a deep convolutional neural network pretrained on image classification as the teacher network and distill it into a lightweight student network. During the offline distillation training process, we propose an attention transfer loss to ensure that the lightweight student network retains the feature representation power of the large-capacity teacher network. Meanwhile, we propose a correlation tracking loss to transfer the student network from the image classification task to the correlation tracking task, which improves the discriminative ability of the student network. Experiments on OTB, VOT2017, and Temple Color show that, using the learned lightweight network as the feature extractor, a state-of-the-art CF based tracker achieves real-time speed on a single CPU while maintaining almost the same tracking performance.
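To make the two distillation objectives concrete, below is a minimal PyTorch sketch of how they might be implemented. The exact loss forms are assumptions, since the abstract does not specify them: the attention transfer loss follows the common activation-based formulation (channel-wise pooled, L2-normalized attention maps), and the correlation tracking loss is sketched as L2 regression of a cross-correlation response map against a Gaussian label. All function names (attention_map, cross_correlation, etc.) and the weighting factor are hypothetical.

import torch
import torch.nn.functional as F

def attention_map(feat, p=2):
    # Channel-wise sum of |F|^p yields a (B, H, W) spatial attention map;
    # flatten and L2-normalize per sample so scale differences cancel out.
    a = feat.abs().pow(p).sum(dim=1).flatten(start_dim=1)  # (B, H*W)
    return F.normalize(a, dim=1)

def attention_transfer_loss(student_feat, teacher_feat):
    # Pull the student's attention map toward the frozen teacher's
    # (teacher gradients are blocked with detach).
    return (attention_map(student_feat)
            - attention_map(teacher_feat.detach())).pow(2).mean()

def cross_correlation(template, search):
    # Per-sample cross-correlation via a grouped convolution:
    # each template acts as the kernel for its own search region.
    b, c, h, w = template.shape
    resp = F.conv2d(search.reshape(1, b * c, *search.shape[-2:]),
                    template.reshape(b * c, 1, h, w), groups=b * c)
    return resp.reshape(b, c, *resp.shape[-2:]).sum(dim=1)  # (B, H', W')

def correlation_tracking_loss(template_feat, search_feat, gaussian_label):
    # Encourage a response map that peaks at the target center
    # (hypothetical L2 regression against a Gaussian label map).
    return F.mse_loss(cross_correlation(template_feat, search_feat),
                      gaussian_label)

if __name__ == "__main__":
    # Toy shapes only; assumes teacher and student maps share spatial size.
    s_feat = torch.randn(2, 64, 31, 31)   # student features
    t_feat = torch.randn(2, 64, 31, 31)   # teacher features
    tmpl = torch.randn(2, 64, 15, 15)     # target template features
    srch = torch.randn(2, 64, 31, 31)     # search-region features
    label = torch.zeros(2, 17, 17)        # 31 - 15 + 1 = 17
    label[:, 8, 8] = 1.0                  # simplified Gaussian peak
    total = attention_transfer_loss(s_feat, t_feat) \
            + 0.5 * correlation_tracking_loss(tmpl, srch, label)
    print(total.item())

In this reading, the attention term preserves what the teacher attends to, while the correlation term adapts the student's features to produce sharp, well-localized response maps for tracking; the 0.5 weighting between the two terms is an illustrative placeholder.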
