Abstract

A robust and accurate real-time navigation system is crucial for autonomous robotics. In particular, GNSS-denied environments and poor visual conditions remain very challenging, as vision-based approaches tend to fail in darkness, direct sunlight, fog, or smoke. Therefore, we take advantage of inertial data and FMCW radar sensors, as neither is affected by such conditions. In this work, we propose a framework that uses several 4D mmWave radar sensors simultaneously. The extrinsic calibration of each radar sensor is estimated online. From each single radar scan, the 3D ego velocity and, optionally, yaw measurements based on Manhattan world assumptions are fused. An extensive evaluation with real-world datasets is presented. We achieve even higher accuracy than state-of-the-art stereo Visual Inertial Odometry (VIO) while coping with degraded visual conditions and requiring only very little computational resources.
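The abstract does not detail how the 3D ego velocity is obtained from a single scan, so the following is only a minimal illustrative sketch of the commonly used Doppler-based approach: each detected static target provides a radial (Doppler) velocity along its line of sight, and stacking these constraints yields a linear least-squares problem for the sensor's ego velocity. The function name and frames are hypothetical, not the authors' implementation.

```python
# Illustrative sketch (assumption, not the paper's exact pipeline):
# for a static scene, a target with unit direction r_i (sensor -> target)
# and measured Doppler velocity v_i satisfies v_i = -r_i . v_ego.
import numpy as np

def estimate_ego_velocity(directions: np.ndarray, doppler: np.ndarray) -> np.ndarray:
    """directions: (N, 3) unit vectors from the sensor to each target.
    doppler: (N,) measured radial velocities (positive = target receding).
    Returns the 3D ego velocity of the sensor in the sensor frame."""
    A = -directions                           # each row contributes -r_i . v_ego = v_i
    v_ego, *_ = np.linalg.lstsq(A, doppler, rcond=None)
    return v_ego

# Example: sensor moving at 1 m/s along x, three static targets.
dirs = np.array([[1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0],
                 [0.70710678, 0.70710678, 0.0]])
true_v = np.array([1.0, 0.0, 0.0])
dopp = -(dirs @ true_v)                       # ideal (noise-free) Doppler measurements
print(estimate_ego_velocity(dirs, dopp))      # approx. [1, 0, 0]
```

In practice such a solver is typically wrapped in an outlier-rejection scheme so that moving objects in the scene do not corrupt the estimate; that step is omitted here for brevity.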
