Abstract
This article proposes a method for sensor calibration and obstacle detection in urban environments. A radar and a 3D LIDAR were calibrated with respect to a stereo camera, and data from the radar, the 3D LIDAR and the stereo camera were fused to detect obstacles and determine their shapes. The data collected with the camera and the 3D LIDAR form a point cloud, which is segmented using the radar detections as obstacle hypotheses. This sensor fusion approach reduces processing, improves obstacle detection and lowers the number of false positives.
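To illustrate the idea of using radar detections as obstacle hypotheses, the sketch below gates a fused point cloud around each detection with a simple radius test. It is not the authors' implementation; the function name, the 2 m gating radius and the assumption that all data are expressed in a common camera-aligned frame are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's method): segment a LIDAR/stereo
# point cloud using radar detections as obstacle hypotheses.
import numpy as np

def segment_by_radar_hypotheses(point_cloud, radar_detections, radius=2.0):
    """Return, per radar detection, the cloud points within `radius` metres
    of that detection (all coordinates assumed in the same sensor frame)."""
    segments = []
    for det in radar_detections:                       # det: (x, y, z) hypothesis
        d = np.linalg.norm(point_cloud - det, axis=1)  # distance of every point
        segments.append(point_cloud[d < radius])       # keep only nearby points
    return segments

# Toy usage: a random 1000-point cloud and two radar hypotheses.
cloud = np.random.uniform(-10, 10, size=(1000, 3))
detections = np.array([[2.0, 0.0, 5.0], [-3.0, 1.0, 8.0]])
obstacles = segment_by_radar_hypotheses(cloud, detections)
print([len(o) for o in obstacles])
```

Restricting segmentation to regions around radar hypotheses is what lets the approach avoid processing the full point cloud, which is the source of the claimed reduction in computation and false positives.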