Abstract
Rotation, occlusion, and scale change remain pressing challenges in visual object tracking. In recent years, the online learning ability and discriminative power of convolutional neural network (CNN) features have attracted great attention in many computer vision tasks. Previous studies have shown that features from individual CNN layers can address specific tracking challenges, such as accurate target localization, rotation, and deformation. However, tracking with features from a single layer is not sufficient to handle severe combinations of these challenges. To address this problem, we evaluate the contribution of features from specific CNN layers to tracking and present a complementary CNN-features tracking framework that treats the tracking procedure as an optimization over deep CNN features. Our tracker does not merely fuse CNN features; it operates at the semantic level, adding a competition mechanism that optimizes the encoding of the target appearance across CNN layers. The proposed tracker is therefore robust not only in target localization but also to rotation and deformation. We also embed hard negative mining to enhance the discriminative power of the tracking model and apply bounding box regression to refine the tracking result. Experimental results on a large-scale dataset demonstrate that the proposed tracker outperforms state-of-the-art trackers.
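The competition mechanism described above can be illustrated with a minimal sketch. The abstract does not specify how layer responses compete, so the reliability measure used here (a peak-to-sidelobe ratio weighting of per-layer correlation response maps, a common heuristic in correlation-filter tracking) is an assumption for illustration, not the paper's actual method; the array shapes and the `fuse_layer_responses` helper are likewise hypothetical.

```python
import numpy as np

def peak_to_sidelobe_ratio(response):
    """Reliability of a correlation response map: how sharply the
    peak stands out from the rest of the map (higher = more reliable)."""
    peak = response.max()
    return (peak - response.mean()) / (response.std() + 1e-8)

def fuse_layer_responses(responses):
    """Illustrative competition mechanism (assumed, not the paper's
    method): weight each layer's response map by its reliability score,
    then sum into a single fused map for target localization."""
    scores = np.array([peak_to_sidelobe_ratio(r) for r in responses])
    weights = scores / scores.sum()  # normalized per-layer weights
    fused = sum(w * r for w, r in zip(weights, responses))
    return fused, weights

# Toy example: two hypothetical layer responses on a 5x5 search grid.
rng = np.random.default_rng(0)
shallow = rng.random((5, 5)) * 0.2
shallow[1, 1] = 1.0  # sharp peak: shallow layers localize precisely
deep = rng.random((5, 5)) * 0.2
deep[1, 2] = 0.6     # broader peak: deep layers encode semantics

fused, weights = fuse_layer_responses([shallow, deep])
row, col = np.unravel_index(fused.argmax(), fused.shape)
```

In this toy case the sharper shallow-layer response wins the competition, so the fused peak lands on the precise shallow-layer location while the deep layer still contributes a semantic prior.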