Abstract

Localization of a mobile robot in the absence of an absolute position sensor often relies on techniques such as visual or lidar-inertial odometry. While lidar has many advantages, the most capable sensors use scanning mechanisms, leading to motion-distorted scans. Previous strategies for accounting for robot motion during state estimation and outlier rejection have drawbacks for use on highly dynamic, resource-constrained robots such as spacecraft during descent and landing. In this paper, we develop a novel probabilistic factor for the inclusion of scanning lidar features, along with an accompanying outlier rejection methodology. By using well-established, efficient feature tracking techniques, our image processing front end is both reliable and amenable to FPGA implementation, both of which are critical for operation on a spacecraft. We demonstrate our technique on a dataset from simulated planetary descent and landing. The results show that our system can perform accurate lidar-inertial odometry, even in highly dynamic scenarios.
