Abstract

Multiple object detection and tracking is a fundamental part of scene understanding in self-driving vehicles, mobile robots, and other unmanned systems. In general, 2D images and 3D point clouds are the two main kinds of input, where the 3D points may come from an RGB-D camera, radar, or LiDAR. This paper focuses on methods based on 3D LiDAR point clouds. Because depth information is lost during image capture and images are vulnerable to illumination changes, 2D image-based multi-object tracking (MOT) methods still face great challenges in real applications. 3D LiDAR-based object detection and tracking methods show advantages over 2D image-based methods in terms of robustness and accuracy. However, in most of the existing literature, 3D object detection and tracking are studied separately. Therefore, a compact tracking-by-detection 3D multiple object tracking (3D MOT) method is proposed. Following the tracking-by-detection framework commonly adopted in 2D image-based MOT, a two-stage 3D object detector is employed to improve detection accuracy and robustness, followed by an efficient tracker based on a 3D Kalman filter and the Hungarian algorithm. Experiments confirm that the presented method is efficient enough for real-time application and reliable in object tracking, achieving excellent 3D MOT performance on the KITTI dataset.
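To make the tracker stage concrete, the following Python sketch shows a constant-velocity 3D Kalman filter combined with Hungarian association, in the style of tracking-by-detection pipelines like the one described above. The state layout (box parameters x, y, z, l, w, h, yaw plus linear velocities), the noise values, the centroid-distance cost with a `max_dist` gate, and the `KalmanTrack`/`associate` names are all illustrative assumptions; the abstract does not specify these details, and a 3D-IoU cost is a common alternative to centroid distance.

```python
# Minimal 3D Kalman filter + Hungarian association sketch (assumptions noted above).
import numpy as np
from scipy.optimize import linear_sum_assignment

class KalmanTrack:
    """Constant-velocity Kalman filter over a 3D box (x, y, z, l, w, h, yaw)."""
    def __init__(self, box):
        # State: [x, y, z, l, w, h, yaw, vx, vy, vz]
        self.x = np.zeros(10)
        self.x[:7] = box
        self.P = np.eye(10) * 10.0           # assumed initial uncertainty
        self.F = np.eye(10)                  # constant-velocity motion model
        self.F[0, 7] = self.F[1, 8] = self.F[2, 9] = 1.0
        self.H = np.zeros((7, 10))           # we observe the 7 box parameters
        self.H[:7, :7] = np.eye(7)
        self.Q = np.eye(10) * 0.01           # assumed process noise
        self.R = np.eye(7) * 0.1             # assumed measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:7]

    def update(self, z):
        y = z - self.H @ self.x              # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(10) - K @ self.H) @ self.P

def associate(tracks, detections, max_dist=2.0):
    """Match predicted tracks to detected boxes with the Hungarian algorithm,
    using box-centroid distance as the cost."""
    if not tracks or not detections:
        return [], list(range(len(tracks))), list(range(len(detections)))
    preds = np.array([t.predict() for t in tracks])      # (T, 7)
    dets = np.array(detections)                          # (D, 7)
    cost = np.linalg.norm(preds[:, None, :3] - dets[None, :, :3], axis=2)
    rows, cols = linear_sum_assignment(cost)
    matches = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]
    matched_t = {r for r, _ in matches}
    matched_d = {c for _, c in matches}
    unmatched_t = [i for i in range(len(tracks)) if i not in matched_t]
    unmatched_d = [j for j in range(len(detections)) if j not in matched_d]
    return matches, unmatched_t, unmatched_d
```

In a full tracking-by-detection loop, each matched track would be refreshed via `update()` with its associated detection, unmatched detections would spawn new tracks, and tracks left unmatched for several consecutive frames would be terminated; this birth/death bookkeeping is the usual complement to the filter-and-match step sketched here.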
