Abstract

Simultaneous localization and mapping (SLAM) is a well-known problem in autonomous mobile robotics in which a robot must localize itself in unknown environments by processing onboard sensor data, without external reference systems such as the Global Positioning System (GPS). In this work, we present a visual feature-based SLAM system that produces high-quality three-dimensional maps in real time with a low-cost RGB-D camera such as the Microsoft Kinect, making the maps suitable for planning and other common robot navigation tasks. First, we present a comprehensive performance evaluation of combinations of different state-of-the-art feature detectors and descriptors. The main purpose of this evaluation is to determine the best detector-descriptor combination for robot navigation. Second, we use the Iterative Closest Point (ICP) algorithm to estimate the relative motion between consecutive frames and then refine the pose estimate by following the composition rule. However, the spatial distribution and resolution of the depth data affect the performance of ICP-based 3D scene reconstruction. We therefore propose an adaptive architecture that computes the pose estimate from the most reliable measurements in a given environment. We evaluate our approach extensively on commonly available benchmark datasets. The experimental results demonstrate that our system robustly handles challenging scenarios while remaining fast enough for online applications.
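The abstract describes frame-to-frame ICP followed by pose composition. The sketch below is only an illustration of that general idea, not the authors' implementation: it uses the Open3D library as a stand-in registration backend, and the function name, correspondence threshold, and point-to-point estimator are assumptions for the example.

```python
# Illustrative sketch (assumed details, not the paper's code): align the
# current RGB-D frame to the previous one with ICP, then chain the relative
# motion onto the previous global pose via the composition rule.
import numpy as np
import open3d as o3d


def incremental_pose(prev_cloud, curr_cloud, prev_pose, max_corr_dist=0.05):
    """Return the 4x4 world pose of the current frame.

    prev_cloud, curr_cloud : o3d.geometry.PointCloud from consecutive frames
    prev_pose              : 4x4 world pose of the previous frame
    max_corr_dist          : ICP correspondence distance (meters, assumed value)
    """
    # ICP registers the current frame (source) against the previous frame
    # (target), yielding the relative transform T_{prev <- curr}.
    result = o3d.pipelines.registration.registration_icp(
        curr_cloud, prev_cloud, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    relative_motion = result.transformation

    # Composition rule: T_{world <- curr} = T_{world <- prev} * T_{prev <- curr}.
    return prev_pose @ relative_motion
```

In a full pipeline this step would run per frame, with the adaptive selection described in the abstract deciding which measurements feed the registration.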
