Abstract

Incorrect landmark and loop-closure measurements can cause standard SLAM algorithms to fail catastrophically. Several SLAM algorithms have recently been proposed that are robust to loop-closure errors, but this paper shows that they cannot provide robust solutions when landmark measurement errors occur. The root cause is that these robust SLAM algorithms focus only on generating solutions that are locally consistent (i.e., each measurement agrees with its corresponding estimates) rather than globally consistent (i.e., all of the measurements in the solution agree with each other). Moreover, these algorithms do not attempt to maximize the number of correct measurements included in the solution, so correct measurements are often ignored and solution quality suffers as a result. This paper proposes a new formulation of the robust SLAM problem that seeks a globally consistent map while also maximizing the number of measurements included in the solution. In addition, a novel incremental SLAM algorithm, called incremental SLAM with consistency-checking, is developed to solve the new robust SLAM problem. Finally, simulated and experimental results show that the new algorithm significantly outperforms state-of-the-art robust SLAM methods on datasets with incorrect landmark measurements and matches their performance on datasets with incorrect loop closures.
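The distinction the abstract draws can be illustrated with a toy sketch. Below is a hypothetical 1-D example (not the paper's algorithm): a set of scalar measurements of the same landmark is *globally* consistent when every pair of measurements agrees within a tolerance, not merely when each measurement agrees with its own estimate, and the objective is to keep the largest such subset. The function names and the brute-force search are illustrative assumptions only; the paper's incremental consistency-checking method operates on full SLAM factor graphs.

```python
import itertools

def globally_consistent(measurements, tol):
    """A toy notion of global consistency: every pair of measurements
    of the same 1-D landmark must mutually agree within `tol`."""
    return all(abs(a - b) <= tol
               for a, b in itertools.combinations(measurements, 2))

def max_consistent_subset(measurements, tol):
    """Brute-force search for the largest globally consistent subset,
    mirroring the 'maximize included measurements' objective.
    Exponential in the number of measurements; illustration only."""
    for size in range(len(measurements), 0, -1):
        for subset in itertools.combinations(measurements, size):
            if globally_consistent(subset, tol):
                return list(subset)
    return []

# Three mutually agreeing measurements plus one outlier (5.0):
# the outlier is rejected, and all three inliers are retained.
inliers = max_consistent_subset([1.0, 1.02, 0.98, 5.0], tol=0.1)
```

A purely local check would compare each measurement only against the current landmark estimate, so a bad initial estimate (or an outlier that arrives first) can cause correct measurements to be discarded; the pairwise criterion above avoids that failure mode at the cost of a combinatorial search.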
