Abstract

This paper presents a Real-Time Bird's Eye View Multi-Object Tracking (MOT) system pipeline for an autonomous electric car, based on Fast Encoders for object detection and a combination of the Hungarian algorithm and a Bird's Eye View (BEV) Kalman Filter, used for data association and state estimation, respectively. The system analyzes the full 360 degrees around the ego-vehicle and estimates the future trajectories of surrounding objects, providing essential input for other layers of a self-driving architecture, such as control or decision-making. First, our system pipeline is described, merging the concepts of online and real-time DATMO (Detection and Tracking of Multiple Objects), ROS (Robot Operating System) and Docker to ease the integration of the proposed MOT system into fully autonomous driving architectures. Second, the pipeline is validated using the recently proposed KITTI-3DMOT evaluation tool, which assesses the full strength of 3D localization and tracking in a MOT system. Finally, our proposal is compared with other state-of-the-art approaches using both the mainstream metrics of MOT benchmarks and the recently proposed integral MOT metrics, which evaluate the tracking performance over all detection thresholds.
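The abstract does not specify the exact formulation of the tracking stage (state vector, noise models, or cost function), so the following Python sketch only illustrates the general technique it names: one constant-velocity Kalman filter per track in BEV coordinates, with data association solved by the Hungarian algorithm via SciPy's linear_sum_assignment. Class and function names (BEVKalmanTrack, associate), the Euclidean cost, and all parameter values are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of BEV tracking: a Kalman filter per track plus
# Hungarian data association. Assumptions (not from the paper):
# constant-velocity state [x, y, vx, vy] and a Euclidean-distance cost.
import numpy as np
from scipy.optimize import linear_sum_assignment


class BEVKalmanTrack:
    def __init__(self, x, y, dt=0.1):
        self.x = np.array([x, y, 0.0, 0.0])                    # position + velocity
        self.P = np.eye(4) * 10.0                               # state covariance
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = dt    # constant-velocity model
        self.H = np.eye(2, 4)                                   # we only observe (x, y)
        self.Q = np.eye(4) * 0.1                                # process noise
        self.R = np.eye(2) * 1.0                                # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        y = z - self.H @ self.x                                 # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)                # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P


def associate(predictions, detections, gate=3.0):
    """Hungarian matching between predicted track positions and BEV
    detections; pairs with cost above `gate` are left unmatched."""
    if len(predictions) == 0 or len(detections) == 0:
        return [], list(range(len(predictions))), list(range(len(detections)))
    cost = np.linalg.norm(predictions[:, None, :] - detections[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    matches = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]
    matched_t = {r for r, _ in matches}
    matched_d = {c for _, c in matches}
    unmatched_t = [i for i in range(len(predictions)) if i not in matched_t]
    unmatched_d = [j for j in range(len(detections)) if j not in matched_d]
    return matches, unmatched_t, unmatched_d


# One tracking cycle on hypothetical data: predict, associate, update.
tracks = [BEVKalmanTrack(0.0, 0.0), BEVKalmanTrack(5.0, 2.0)]
detections = np.array([[0.3, 0.1], [5.2, 2.4]])
preds = np.array([t.predict() for t in tracks])
matches, lost, new = associate(preds, detections)
for ti, di in matches:
    tracks[ti].update(detections[di])
```

In a full pipeline, unmatched detections would spawn new tracks and tracks unmatched for several frames would be deleted; the gating threshold and noise matrices would be tuned per object class.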
