Abstract
Achieving accurate object tracking under noise spoofing interference caused by similar backgrounds, similar objects, occlusion, or illumination variation remains challenging, especially because Siamese trackers largely ignore the spatial-temporal and part-level information of object structure. To address this problem, we present a novel spatial-temporal-aware object tracking network that learns the correlation of local feature changes and extracts global spatial-temporal fusion features of the object to suppress noise spoofing interference. Specifically, our tracker integrates an object graph reconstruction representation module and a spatial-temporal graph Transformer module. The graph reconstruction representation module models the object structure and the search region as part-to-part graph node correspondences and propagates object information to achieve part-level feature aggregation in the search region. The spatial-temporal graph Transformer module fuses the temporal and spatial features of the object, strengthening contextual features and the correlation between temporal and spatial feature variations so that noise spoofing interference can be recognized and tracking performance improved. In addition, our tracker learns object motion information to improve object state awareness, which enhances tracking accuracy under interference from similar objects. Extensive experiments on six public datasets show that our tracker outperforms related state-of-the-art trackers and achieves accurate tracking against noise spoofing interference.
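To make the two-module pipeline described above concrete, the following is a minimal, hypothetical PyTorch sketch of how a part-to-part graph reconstruction step and a spatial-temporal fusion Transformer could be wired together. All class names, tensor shapes, and the attention-based formulations are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: module names, shapes, and attention-based fusion
# are assumptions for exposition; they do not reproduce the paper's method.
import torch
import torch.nn as nn


class GraphReconstructionRepresentation(nn.Module):
    """Propagates template (object) part features to search-region parts via
    soft part-to-part node correspondences (cross-attention style)."""

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, search_parts: torch.Tensor, template_parts: torch.Tensor) -> torch.Tensor:
        # search_parts: (B, Ns, C) graph nodes of the search region
        # template_parts: (B, Nt, C) graph nodes of the object template
        attn = torch.softmax(
            self.q(search_parts) @ self.k(template_parts).transpose(1, 2)
            / search_parts.shape[-1] ** 0.5, dim=-1)          # (B, Ns, Nt) correspondences
        return search_parts + attn @ self.v(template_parts)   # part-level aggregation


class SpatialTemporalGraphTransformer(nn.Module):
    """Fuses current-frame (spatial) node features with history (temporal)
    node features so self-attention can model their correlations."""

    def __init__(self, dim: int, heads: int = 4, depth: int = 2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, current_nodes: torch.Tensor, history_nodes: torch.Tensor) -> torch.Tensor:
        # Concatenate spatial and temporal tokens, fuse them, and return only
        # the (now temporally contextualized) current-frame tokens.
        fused = self.encoder(torch.cat([current_nodes, history_nodes], dim=1))
        return fused[:, : current_nodes.shape[1]]


if __name__ == "__main__":
    B, Ns, Nt, C = 2, 256, 64, 128
    graph_rep = GraphReconstructionRepresentation(C)
    st_former = SpatialTemporalGraphTransformer(C)
    search = torch.randn(B, Ns, C)        # search-region part features
    template = torch.randn(B, Nt, C)      # object part features
    history = torch.randn(B, Ns, C)       # features from earlier frames
    fused = st_former(graph_rep(search, template), history)
    print(fused.shape)                    # torch.Size([2, 256, 128])
```

In this sketch the aggregated search-region features would feed a prediction head (not shown), and the history tokens stand in for whatever temporal buffer the tracker maintains across frames.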