Abstract

RGB-D dense mapping systems are widely used for indoor navigation and augmented reality, but drift in egomotion estimation accumulates as a system operates over larger spaces and longer time spans. In this paper, we develop a vertex-to-edge weighted closed-form algorithm to reduce camera drift for dense RGB-D indoor simultaneous localization and mapping (SLAM). First, because common RGB-D sensors run at high frame rates, we employ a robust key-frame selection strategy. We then account for the sensor's typical depth-noise properties and exploit both visual and geometric information to estimate the camera motion of the key frames. To enforce global consistency, we combine several techniques to improve the efficiency and accuracy of loop-closure detection, and we mitigate trajectory drift with factor graph optimization in which the graph edges are weighted by their residual errors. To demonstrate its accuracy and robustness, our SLAM system is tested on seven sequences from publicly available benchmark datasets collected with a Kinect v1 and on three sequences captured with a handheld Structure Sensor. In a direct comparison, our approach yields significantly less trajectory error than several other state-of-the-art methods.
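The idea of weighting factor-graph edges by their residual errors can be illustrated with a deliberately simplified 1-D pose-graph sketch. Everything below is a hypothetical illustration under assumed details (a Cauchy-style weighting kernel, toy odometry and loop-closure measurements, invented function names), not the paper's actual algorithm:

```python
import numpy as np

def residual_weights(edges, x, c=0.1):
    """Down-weight edges whose current residual is large (Cauchy-style kernel).

    edges: list of (i, j, z) meaning the measurement x[j] - x[i] = z.
    x: current 1-D pose estimates.  (Illustrative choice of kernel, not the paper's.)
    """
    w = []
    for i, j, z in edges:
        r = (x[j] - x[i]) - z          # residual of this edge
        w.append(1.0 / (1.0 + (r / c) ** 2))
    return np.array(w)

def optimize(edges, x_init, n_iter=5):
    """Iteratively reweighted least squares on a 1-D pose graph, x[0] anchored at 0."""
    x = x_init.copy()
    n = len(x)
    for _ in range(n_iter):
        w = residual_weights(edges, x)
        A = np.zeros((len(edges), n - 1))  # columns for x[1..n-1]; x[0] is fixed
        b = np.zeros(len(edges))
        for k, (i, j, z) in enumerate(edges):
            if i > 0:
                A[k, i - 1] -= 1.0
            if j > 0:
                A[k, j - 1] += 1.0
            b[k] = z
        W = np.diag(w)                     # residual-derived edge weights
        sol = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
        x = np.concatenate([[0.0], sol])
    return x

# Toy data: three odometry edges (the third is biased) plus one loop closure
# asserting that x[3] - x[0] should be 3.0.
edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.2), (3, 0, -3.0)]
x_init = np.array([0.0, 1.0, 2.0, 3.2])    # chained odometry as initial guess
x_opt = optimize(edges, x_init)
# The loop closure pulls x_opt[3] from 3.2 back toward 3.0.
```

The design point this sketch makes is that edges with large residuals (likely outliers or drifted measurements) receive small weights in the normal equations, so a single bad constraint cannot dominate the optimized trajectory.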
