Abstract

Simultaneous localization and mapping (SLAM) is an essential capability for unmanned ground vehicles (UGVs) travelling in unknown environments where globally accurate position data, such as GPS, are not available, and it is an important topic in autonomous mobile robot research. This paper presents an adaptive decentralized cooperative vision-based SLAM solution for multiple UGVs, using Adaptive Covariance Intersection (ACI) supported by a stereo vision sensor. In recent years, the SLAM problem has received particular attention; the most commonly used approaches are the EKF-SLAM algorithm and the FastSLAM algorithm. The former, which requires an accurate process model and observation model, suffers from the linearization problem; the latter is not suitable for real-time implementation. In our work, we propose solving the Visual SLAM (VSLAM) problem with the Smooth Variable Structure Filter (SVSF). This filter is robust and stable in the presence of modelling uncertainties, making it suitable for the UGV localization and mapping problem. The strategy retains the near-optimal performance of the SVSF when applied to an uncertain system, with the added benefit of a considerable improvement in the robustness of the estimation process. Each UGV contributes feature data, fused by the ACI, to the position estimates on the global map. The result is a large, reliable map built by the group of UGVs, with each vehicle plotted on it. This paper presents a cooperative SVSF-VSLAM algorithm that contributes to solving the adaptive cooperative vision SLAM problem for multiple UGVs. The algorithm was implemented on three Pioneer 3-AT mobile robots using stereo vision sensors. Simulation results show its efficiency and give an advantage to the proposed algorithm over the cooperative EKF-VSLAM algorithm, mainly with respect to noise quality.
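To make the fusion step concrete, the following Python sketch implements the generic Covariance Intersection rule on which the ACI builds; the function name, the grid search over the weight w, and the trace-minimization criterion are illustrative assumptions and do not reproduce the adaptive weighting scheme of the proposed algorithm.

# Minimal sketch of the Covariance Intersection (CI) fusion rule underlying the
# ACI step. The weight selection below (grid search minimizing the trace of the
# fused covariance) is an assumption for illustration, not the paper's method.
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=100):
    """Fuse two estimates (x1, P1) and (x2, P2) with unknown cross-correlation.

    CI yields a consistent fused estimate:
        P^-1 = w * P1^-1 + (1 - w) * P2^-1
        x    = P (w * P1^-1 x1 + (1 - w) * P2^-1 x2)
    """
    P1_inv, P2_inv = np.linalg.inv(P1), np.linalg.inv(P2)
    best_w, best_trace, best = 0.0, np.inf, None
    for w in np.linspace(0.0, 1.0, n_grid + 1):
        P = np.linalg.inv(w * P1_inv + (1.0 - w) * P2_inv)
        if np.trace(P) < best_trace:
            x = P @ (w * P1_inv @ x1 + (1.0 - w) * P2_inv @ x2)
            best_w, best_trace, best = w, np.trace(P), (x, P)
    return best + (best_w,)

# Example: fuse two landmark position estimates produced by two UGVs
# (the numerical values are made up for illustration).
x_a, P_a = np.array([2.0, 1.0]), np.diag([0.09, 0.25])
x_b, P_b = np.array([2.2, 0.9]), np.diag([0.16, 0.04])
x_f, P_f, w = covariance_intersection(x_a, P_a, x_b, P_b)
print(x_f, w)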

Highlights

  • In the last few years, simultaneous localization and mapping (SLAM) has become an important research topic in the robotics community

  • SLAM is the process that enables a mobile robot to localize itself and build a map of an unknown environment using only observations of the most relevant features detected by its sensors

  • The EKF-SLAM algorithm only works with feature maps



Introduction

In the last few years, simultaneous localization and mapping (SLAM) has become an important research topic in the robotics community. The appeal of the Kalman Filter (KF) comes from the fact that it estimates a fully correlated posterior over the feature map and the robot pose. The EKF approximates the SLAM posterior as a high-dimensional Gaussian over all features in the map and the robot pose. The single-hypothesis representation and the quadratic complexity of this high-dimensional Gaussian approximation over the robot state and landmark locations make the off-diagonal elements of the covariance matrix very large; this increases the complexity and computational cost and, in most cases, causes the filter to diverge [10]. The quadratic complexity is a consequence of the Gaussian representation employed by the EKF.
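To see why this representation is expensive, consider the size of the joint covariance: with a 3-state robot pose and n two-dimensional landmarks it is a dense (3 + 2n) x (3 + 2n) matrix, and a single EKF update rewrites all of its entries. The following Python sketch illustrates this point only; it is not code from the paper, and the Jacobian structure and noise values are made-up assumptions.

# Illustrative sketch (not from the paper) of why EKF-SLAM cost grows
# quadratically: the joint covariance over the robot pose (3 states) and n
# 2-D landmarks is a dense (3 + 2n) x (3 + 2n) matrix, and every measurement
# update rewrites all of it through the gain and covariance-update terms.
import numpy as np

def ekf_update_cost(n_landmarks):
    dim = 3 + 2 * n_landmarks          # [x, y, theta] + (x_i, y_i) per landmark
    P = np.eye(dim)                     # joint covariance (dense in general)
    H = np.zeros((2, dim))              # Jacobian of one observation, which
    H[:, 0:2] = np.eye(2)               # depends on the robot position...
    H[:, 3:5] = -np.eye(2)              # ...and on the observed landmark
    R = 0.01 * np.eye(2)                # measurement noise (assumed value)
    S = H @ P @ H.T + R                 # innovation covariance (2 x 2)
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain: (3 + 2n) x 2
    P_new = P - K @ S @ K.T             # full covariance is rewritten
    return P_new.size                   # number of entries updated

for n in (10, 100, 1000):
    print(n, ekf_update_cost(n))        # grows as (3 + 2n)^2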
