Abstract

Object tracking has made significant progress in recent years. However, state-of-the-art trackers are becoming increasingly large and computationally expensive, making them difficult to deploy in resource-constrained applications. In this study, we introduce MobileTrack, a visual object tracker that strikes a strong balance between tracking accuracy and inference speed. Built on a novel coordinated perception-aware fusion module and a lightweight prediction head, MobileTrack outperforms most Siamese trackers on multiple academic benchmarks in both accuracy and efficiency. When deployed on resource-constrained embedded devices such as the NVIDIA Jetson TX2, MobileTrack maintains real-time performance at over 33 FPS, whereas LightTrack runs at only 18 FPS. MobileTrack therefore holds significant potential to unlock a wide range of practical applications across industries. MobileTrack is released here.
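
The abstract only names the two architectural components; as a rough illustration, the sketch below shows how a coordinate-attention-style fusion block and a depthwise-separable prediction head might be wired together in PyTorch. All class names, tensor shapes, and the template-modulation step are assumptions made for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a lightweight Siamese-style tracker head, loosely
# following the abstract's description (fusion module + lightweight head).
# Every design choice here is an assumption, not MobileTrack's actual code.
import torch
import torch.nn as nn


class CoordinateAwareFusion(nn.Module):
    """Fuse template and search features with coordinate-style attention (assumed design)."""

    def __init__(self, channels: int):
        super().__init__()
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # pool along width, keep height
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # pool along height, keep width
        self.reduce = nn.Sequential(
            nn.Conv2d(channels, channels // 4, kernel_size=1),
            nn.BatchNorm2d(channels // 4),
            nn.ReLU(inplace=True),
        )
        self.attn_h = nn.Conv2d(channels // 4, channels, kernel_size=1)
        self.attn_w = nn.Conv2d(channels // 4, channels, kernel_size=1)

    def forward(self, z: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # Modulate search features channel-wise by globally pooled template
        # features (a crude stand-in for depthwise cross-correlation).
        corr = x * z.mean(dim=(2, 3), keepdim=True)
        h, w = corr.shape[2], corr.shape[3]
        # Encode positional context along each spatial axis, then split back.
        feat = torch.cat([self.pool_h(corr), self.pool_w(corr).transpose(2, 3)], dim=2)
        feat = self.reduce(feat)
        f_h, f_w = torch.split(feat, [h, w], dim=2)
        a_h = torch.sigmoid(self.attn_h(f_h))                   # (B, C, H, 1)
        a_w = torch.sigmoid(self.attn_w(f_w.transpose(2, 3)))   # (B, C, 1, W)
        return corr * a_h * a_w


class LightweightHead(nn.Module):
    """Depthwise-separable classification/regression head (assumed design)."""

    def __init__(self, channels: int):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, groups=channels),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.ReLU(inplace=True),
        )
        self.cls = nn.Conv2d(channels, 1, kernel_size=1)  # foreground score map
        self.reg = nn.Conv2d(channels, 4, kernel_size=1)  # box offsets (l, t, r, b)

    def forward(self, feat: torch.Tensor):
        feat = self.branch(feat)
        return self.cls(feat), self.reg(feat)


if __name__ == "__main__":
    fusion, head = CoordinateAwareFusion(96), LightweightHead(96)
    z = torch.randn(1, 96, 8, 8)    # template features from a shared backbone
    x = torch.randn(1, 96, 16, 16)  # search-region features
    cls_map, reg_map = head(fusion(z, x))
    print(cls_map.shape, reg_map.shape)  # (1, 1, 16, 16) and (1, 4, 16, 16)
```

The depthwise-separable convolutions in the head keep the parameter and FLOP count low, which is consistent with the abstract's emphasis on real-time inference on embedded devices, though the exact configuration used by MobileTrack is not specified here.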
