Abstract

Modern multi-object tracking (MOT) systems usually build trajectories by associating per-frame detections. However, under camera motion, fast object motion, and occlusion, it is difficult to maintain the quality of long-range tracking, or even tracklet purity, especially for small objects. Most tracking frameworks depend heavily on re-identification (ReID) for data association. Unfortunately, ReID-based association is not only unreliable and time-consuming, but also fails to address false negatives for occluded and blurred objects, owing to noisy partial detections, similar appearances, and the lack of temporal-spatial constraints. In this paper, we propose an enhanced MOT paradigm, the Motion-Aware Tracker (MAT). MAT is a plug-and-play solution that focuses on high-performance motion-based prediction, reconnection, and association. First, nonrigid pedestrian motion and rigid camera motion are blended seamlessly in the Integrated Motion Localization (IML) module. Second, the Dynamic Reconnection Context (DRC) module is devised to ensure robust long-range motion-based reconnection. The core ideas in DRC are a motion-based dynamic window and a cyclic pseudo-observation trajectory-filling strategy, which smoothly fill in tracking fragments caused by occlusion or blur. Finally, we present the 3D Integral Image (3DII) module, which efficiently cuts off useless track-detection association connections using temporal-spatial constraints. Extensive experiments are conducted on the challenging MOT16 and MOT17 benchmarks. The results demonstrate that MAT achieves superior performance and surpasses other state-of-the-art trackers by a large margin with high efficiency.
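To make the IML idea concrete, the sketch below blends a per-object motion step with a rigid camera-motion warp. This is an illustrative assumption, not the paper's implementation: `predict_box` uses a simple constant-velocity step as a stand-in for the tracker's nonrigid (e.g., Kalman-style) prediction, and `warp_box` applies a 2x3 affine matrix standing in for an estimated camera-motion transform; the function names and interfaces are hypothetical.

```python
import numpy as np

def warp_box(box, affine):
    """Apply a 2x3 affine camera-motion transform to an axis-aligned box
    (x1, y1, x2, y2) by warping its four corners and re-enclosing them."""
    x1, y1, x2, y2 = box
    corners = np.array([[x1, y1, 1.0],
                        [x2, y1, 1.0],
                        [x1, y2, 1.0],
                        [x2, y2, 1.0]]).T          # shape (3, 4)
    warped = affine @ corners                      # shape (2, 4)
    return [warped[0].min(), warped[1].min(),
            warped[0].max(), warped[1].max()]

def predict_box(box, velocity, affine):
    """Integrated motion prediction (hypothetical sketch): first move the box
    by the object's own velocity (nonrigid motion), then compensate for rigid
    camera motion with the affine warp."""
    vx, vy = velocity
    moved = [box[0] + vx, box[1] + vy, box[2] + vx, box[3] + vy]
    return warp_box(moved, affine)

# Example: object moving 5 px right under an identity camera transform.
identity = np.array([[1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0]])
print(predict_box([0.0, 0.0, 10.0, 10.0], (5.0, 0.0), identity))
```

In a real tracker, the affine matrix would typically be estimated per frame pair (e.g., by image registration), and the velocity step would come from the track's state estimator.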

