Abstract

Light detection and ranging (LiDAR) and global navigation satellite system (GNSS)/inertial measurement unit (IMU) sensors are widely used in autonomous driving systems, and LiDAR-GNSS/IMU calibration directly affects the performance of vehicle localization and perception. Current calibration methods require specific vehicle maneuvers or scenes with artificial calibration markers to keep the problem well constrained; such procedures are empirical, time-consuming, and difficult to automate. To address this problem, this article proposes a novel self-calibration method based on both relative and absolute motion constraints. Initial calibration parameters are computed from relative motion constraints derived from LiDAR odometry. To eliminate the impact of odometry drift and to enhance the observability of the translation parameters, the calibration parameters are then iteratively refined by tightly coupling the relative motion constraints with absolute motion constraints derived from scan-to-global-map matching. Tests on simulated and real ground datasets show that the proposed method is robust and accurate, with RMSEs on the order of <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">$10^{-3}{}^{\circ}$ </tex-math></inline-formula> for rotation and <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">$10^{-3}$ </tex-math></inline-formula> m for translation. Further mapping and localization experiments using the estimated calibration parameters achieve a state-of-the-art absolute localization accuracy of about 3 cm.
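The initialization step described above — recovering the LiDAR-to-GNSS/IMU extrinsic transform from paired relative motions — is commonly posed as the classical hand-eye problem AX = XB. The following NumPy sketch illustrates that generic formulation, not the paper's specific implementation: function names and the least-squares construction are illustrative assumptions. Rotation is recovered by aligning the rotation axes of paired relative motions (a Wahba problem solved via SVD), and translation by stacking the linear constraints (R_A − I)t = R t_B − t_A.

```python
import numpy as np

def rotmat(axis, angle):
    # Rodrigues' formula: rotation about a unit axis by the given angle.
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def rotation_axis(R):
    # Unit rotation axis from the skew-symmetric part of R (angle in (0, pi)).
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return w / np.linalg.norm(w)

def hand_eye_calibrate(motions_imu, motions_lidar):
    """Solve A X = X B for the extrinsic X = (R, t).

    motions_imu, motions_lidar: lists of paired relative transforms (R, t),
    e.g. from GNSS/IMU integration and LiDAR odometry respectively.
    """
    # Rotation: the axis of each IMU relative rotation equals R applied to
    # the axis of the paired LiDAR relative rotation. Align the two axis
    # sets with the Kabsch/SVD solution of the Wahba problem.
    M = np.zeros((3, 3))
    for (Ra, _), (Rb, _) in zip(motions_imu, motions_lidar):
        M += np.outer(rotation_axis(Ra), rotation_axis(Rb))
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ D @ Vt
    # Translation: stack (Ra - I) t = R tb - ta over all motion pairs and
    # solve in the least-squares sense. This is only well conditioned when
    # the trajectory contains rotation about more than one axis, which is
    # why planar vehicle motion makes translation poorly observable.
    A = np.vstack([Ra - np.eye(3) for Ra, _ in motions_imu])
    b = np.concatenate([R @ tb - ta
                        for (_, ta), (_, tb) in zip(motions_imu, motions_lidar)])
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    return R, t
```

With noise-free synthetic motions that rotate about several distinct axes, the solver recovers the ground-truth extrinsic to numerical precision; with real planar driving data, the translation component degrades, which motivates the paper's additional absolute (scan-to-global-map) constraints.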
