Abstract

Accurate localization is a key component of autonomous robotics and various navigation tasks. Navigation in GNSS-denied and visually degraded environments remains very challenging. Approaches based on visual sensors tend to fail under poor visual conditions such as darkness, fog, or smoke. Our approach therefore fuses inertial data with FMCW radar scans, which are not affected by such conditions. We extend a filter-based approach to 3D Radar Inertial Odometry with yaw aiding that exploits Manhattan world assumptions in indoor environments. Once initialized, our approach enables instantaneous yaw aiding from only a single radar scan. The proposed system is evaluated on various indoor datasets, both hand-carried and drone-borne. Our approach achieves accuracies comparable to state-of-the-art Visual Inertial Odometry (VIO) while coping with degraded visual conditions and requiring very little computational resources. We achieve run-times many times faster than real-time even on a small embedded computer, making the approach well suited for online navigation of drones, as demonstrated in indoor flight experiments.
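
To illustrate the Manhattan-world yaw-aiding idea mentioned in the abstract, the following is a minimal, hypothetical sketch of how a yaw measurement could be derived from a single radar point cloud under the assumption that indoor walls are mutually orthogonal. The function name, the normal-direction approximation, and the projection step are illustrative assumptions and are not taken from the paper.

```python
# Hypothetical sketch: estimate sensor yaw relative to a Manhattan world frame
# from a single radar scan projected into the horizontal plane.
# All names and approximations are illustrative, not the paper's implementation.
import numpy as np

def manhattan_yaw(points_xy: np.ndarray) -> float:
    """Estimate the yaw offset between the sensor frame and the dominant
    wall directions.

    points_xy : (N, 2) array of radar points in the horizontal plane.
    Returns the yaw offset in radians, defined only up to a 90-degree
    ambiguity because of the Manhattan symmetry.
    """
    # Approximate local wall directions by the azimuth of consecutive-point
    # differences (a crude stand-in for proper surface-normal estimation).
    diffs = np.diff(points_xy, axis=0)
    theta = np.arctan2(diffs[:, 1], diffs[:, 0])

    # Fold the 90-degree Manhattan symmetry into a single direction by
    # taking the circular mean of the quadrupled angle 4*theta.
    c = np.cos(4.0 * theta).mean()
    s = np.sin(4.0 * theta).mean()
    return np.arctan2(s, c) / 4.0   # yaw in (-45 deg, 45 deg]
```

In a filter-based pipeline such as the one described in the abstract, the Manhattan frame would be fixed once at initialization; every subsequent scan then yields an absolute heading observation of this kind (after resolving the 90-degree ambiguity against the current yaw estimate), which can be fused as a yaw update to suppress heading drift.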
