Abstract

Automated vehicles rely on precise intrinsic and extrinsic calibration of all sensors. An exact calibration leads to accurate localization and object detection results. Especially for sensor data fusion, the transformations between the different sensor frames must be well known. Moreover, modular and redundant platforms require a large number of sensors to cover their full surroundings, which makes the calibration process complex and challenging. In this article, we describe the procedure for calibrating the full sensor setup of a modular autonomous driving platform, consisting of camera, lidar, and radar sensors, in four successive steps. First, the intrinsic and extrinsic camera parameters are determined. Next, the transformations from lidar to camera and from lidar to radar are estimated. Finally, the extrinsic calibration between all lidars and the vehicle frame is performed. In our evaluation, we show that these steps yield an accurate calibration of the complete vehicle.
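As a minimal illustration of the first step, the sketch below shows a standard checkerboard-based intrinsic calibration using OpenCV. This is not the procedure from the article itself; the board dimensions, square size, and image directory are assumptions made for the example.

```python
# Minimal sketch of intrinsic camera calibration with a checkerboard target.
# Board geometry and image paths are illustrative assumptions, not values
# taken from the article.
import glob

import cv2
import numpy as np

BOARD_COLS, BOARD_ROWS = 9, 6   # inner corner counts of the checkerboard (assumed)
SQUARE_SIZE = 0.025             # square edge length in meters (assumed)

# 3D coordinates of the board corners in the board's own frame (z = 0).
object_points = np.zeros((BOARD_ROWS * BOARD_COLS, 3), np.float32)
object_points[:, :2] = np.mgrid[0:BOARD_COLS, 0:BOARD_ROWS].T.reshape(-1, 2)
object_points *= SQUARE_SIZE

obj_pts, img_pts = [], []
image_size = None

for path in glob.glob("calib_images/*.png"):  # hypothetical image directory
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, (BOARD_COLS, BOARD_ROWS))
    if not found:
        continue
    # Refine detected corners to sub-pixel accuracy.
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3),
    )
    obj_pts.append(object_points)
    img_pts.append(corners)
    image_size = gray.shape[::-1]

assert img_pts, "no usable calibration images found"

# Estimate the camera matrix K and distortion coefficients; the RMS
# reprojection error is a measure of calibration quality.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, image_size, None, None
)
print(f"RMS reprojection error: {rms:.3f} px")
print("Camera matrix K:\n", K)
```

The returned RMS reprojection error is a common sanity check: values well below one pixel typically indicate a usable intrinsic calibration, which the subsequent extrinsic steps build on.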
