Abstract

This paper deals with the problem of modeling the spatial uncertainty of point features in feature-based RGB-D SLAM. Although the feature-based approach to SLAM is very popular, in systems using RGB-D data the problem of explicit uncertainty modeling is largely neglected in existing implementations. Therefore, we investigate the influence of the uncertainty models of point features on the accuracy of the estimated trajectory and map. We focus on the recent SLAM formulation employing factor graph optimization. Unlike some visual SLAM systems employing factor graph optimization that minimize the reprojection errors of features, we explicitly use depth measurements and minimize the errors in 3-D space. The paper analyzes the impact of the information matrices used in factor graph optimization on the achieved accuracy. We introduce three different models of point-feature spatial uncertainty. Then, applying the simplest of these models, we demonstrate in simulations how strongly the spatial uncertainty model influences the graph optimization results in an idealized SLAM system with perfect feature matching. A novel software tool allows us to visualize the statistical behavior of the features over time in a real SLAM system. This enables the analysis of the distribution of feature measurements obtained from synthetic RGB-D data processed in an actual SLAM pipeline. Finally, we show on publicly available real RGB-D datasets how an uncertainty model that reflects the properties of the RGB-D sensor and the image processing pipeline improves the accuracy of sensor trajectory estimation.
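For concreteness, the 3-D feature-to-pose error minimized in such a factor-graph formulation can be written in a generic form (our notation, not necessarily the paper's):

\[
\mathbf{e}_{ij} = \mathbf{T}_i^{-1}\,\mathbf{p}_j - \mathbf{m}_{ij},
\qquad
F = \sum_{(i,j)} \mathbf{e}_{ij}^{\top}\,\boldsymbol{\Omega}_{ij}\,\mathbf{e}_{ij},
\qquad
\boldsymbol{\Omega}_{ij} = \mathbf{C}_{ij}^{-1},
\]

where \(\mathbf{T}_i\) is the i-th sensor pose, \(\mathbf{p}_j\) the position of the j-th feature in the world frame, \(\mathbf{m}_{ij}\) the measured 3-D position of that feature in the camera frame, and \(\boldsymbol{\Omega}_{ij}\) the information matrix, i.e. the inverse of the spatial covariance \(\mathbf{C}_{ij}\) of the measurement. The uncertainty models investigated in the paper differ in how \(\mathbf{C}_{ij}\) is obtained.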

Highlights

  • We introduce a new uncertainty analysis methodology based on software tools that allow us to simulate RGB-D simultaneous localization and mapping (SLAM) systems and to analyze the behavior of point features; these tools, in turn, make it possible to understand the nature of the spatial uncertainty of point features in RGB-D SLAM.

  • Although we focus on the local improvement of the trajectory, PUT SLAM with the proposed uncertainty model Cg also provides globally consistent trajectories, as evidenced by the small Absolute Trajectory Error (ATE) RMSE values obtained on the TUM RGB-D benchmark.

  • We propose new uncertainty models of point features that incorporate the axial and lateral spatial uncertainty characteristic of PrimeSense-technology RGB-D sensors, and also take into account the image processing uncertainties of the whole SLAM front-end pipeline (see the sketch after this list).
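To make the kind of model referred to above concrete, the sketch below computes a per-feature 3x3 covariance from an assumed PrimeSense-style noise model: the axial (along-ray) standard deviation grows quadratically with depth, while the lateral spread comes from pixel-level detection uncertainty back-projected to the measured depth. The focal lengths, sigma_px and k_axial are illustrative placeholders, not the calibrated values used in the paper.

    import numpy as np

    def point_covariance(u, v, z, fx=525.0, fy=525.0, cx=319.5, cy=239.5,
                         sigma_px=0.5, k_axial=1.425e-3):
        """Illustrative 3x3 covariance of a point feature observed by a
        PrimeSense-style RGB-D sensor at pixel (u, v) with depth z [m].

        Assumptions (placeholders, not the paper's calibrated model):
          - axial std grows quadratically with depth: sigma_z = k_axial * z**2
          - lateral std is the pixel uncertainty back-projected to depth z:
            sigma_lat = sigma_px * z / f
        """
        sigma_z = k_axial * z**2
        sigma_lx = sigma_px * z / fx
        sigma_ly = sigma_px * z / fy

        # Diagonal covariance in a local frame aligned with the viewing ray
        # (first two axes lateral, third axis axial).
        C_ray = np.diag([sigma_lx**2, sigma_ly**2, sigma_z**2])

        # Rotate the ellipsoid into the camera frame: the viewing ray becomes
        # the third axis of the local frame.
        ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
        ray /= np.linalg.norm(ray)
        tmp = np.array([1.0, 0.0, 0.0]) if abs(ray[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
        x_axis = np.cross(tmp, ray)
        x_axis /= np.linalg.norm(x_axis)
        y_axis = np.cross(ray, x_axis)
        R = np.column_stack([x_axis, y_axis, ray])

        return R @ C_ray @ R.T  # covariance expressed in the camera frame

The inverse of this covariance would then serve as the information matrix of the corresponding feature-to-pose factor, e.g. Omega = np.linalg.inv(point_covariance(u, v, z)).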


Summary

Motivation

Feature-to-pose measurements can be used to find the coordinates of features and the relative sensor motion using the Bundle Adjustment (BA) approach [42]. This approach is applied in the most successful visual SLAM systems, such as PTAM [21], ORB-SLAM [29], and ORB-SLAM2 [30], to obtain accurate camera trajectories. RGB-D SLAM systems use the graph-based optimization framework to integrate all feature-to-pose measurements [12,41]. Our approach to RGB-D SLAM exploits the depth measurements to a greater extent, and therefore it seems a better vehicle for demonstrating the role of modeling the uncertainty of RGB-D measurements than other systems of similar architecture, such as ORB-SLAM2 [30], which triangulates feature positions and uses reprojection errors even for RGB-D data containing dense depth frames. In this work, we investigate how to improve the accuracy of sensor trajectory estimation by explicitly modeling the spatial uncertainty of point features.

Problem statement
Contribution
Related work
RGB-D SLAM with a map of features
SLAM front-end
Spatial uncertainty modeling
Uncertainty propagation from the sensor model
Normal-based uncertainty model
Experiments in Simulation
Method
Reverse SLAM: looking closer at the features
Evaluation of the Uncertainty Models
Impact of the uncertainty models on the SLAM accuracy
Application-oriented evaluation
Conclusions
