Abstract

State-of-the-art approaches to multi-robot localization and mapping still present multiple open issues, offering a wide range of possibilities for researchers and for technological development. This paper presents a new algorithm for visual multi-robot Simultaneous Localization and Mapping (SLAM), used to join, in a common reference frame, the trajectories of several robots that participate simultaneously in a common mission. One of the main problems in centralized configurations, where a leader receives data from the rest of the robots, is the limited communication bandwidth, which delays data transmission, can be overloaded quickly and restricts reactive actions. The approach presented here performs multi-robot visual graph SLAM, building a joint topological map that evolves in different directions according to the trajectories of the different robots. The main contributions of this strategy are: (a) reducing the visual data exchanged among all agents to hashes of small dimension, diminishing, in consequence, the data delivery time; (b) running two different phases of SLAM, intra- and inter-session, each with its own loop-closing task and with a trajectory-joining step in between, combined with high flexibility; (c) simplifying the complete SLAM process, in concept and implementation, and directing it to correct the trajectories of several robots, initially and continuously estimated by means of a visual odometer; and (d) executing the process online, in order to assure a successful accomplishment of the mission, following the planned trajectories through the planned points. Preliminary results included in this paper show a promising performance of the algorithm on visual datasets obtained at different points on the coast of the Balearic Islands, either by divers or by an Autonomous Underwater Vehicle (AUV) equipped with cameras.
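Contribution (a) hinges on exchanging compact image hashes instead of full frames or descriptor sets. The specific hash used by the authors is not detailed in this summary, so the following is only a minimal illustrative sketch, assuming a simple average-hash-style global descriptor compared by Hamming distance to shortlist loop-closing candidates; the names image_hash and hamming are hypothetical.

    import cv2
    import numpy as np

    def image_hash(gray, size=16):
        """Reduce an image to a compact binary hash (illustrative average hash).

        The real hash used in the paper may differ; this only shows how a full
        frame can shrink to a few dozen bytes before being sent to the leader.
        """
        small = cv2.resize(gray, (size, size), interpolation=cv2.INTER_AREA)
        bits = (small > small.mean()).astype(np.uint8)   # size*size bits
        return np.packbits(bits.flatten())               # size*size/8 bytes

    def hamming(h1, h2):
        """Number of differing bits between two packed hashes."""
        return int(np.unpackbits(np.bitwise_xor(h1, h2)).sum())

    # Usage: shortlist loop-closing candidates by hash distance only.
    # img_a, img_b stand for grayscale frames from two robots' cameras.
    img_a = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)
    if img_a is not None and img_b is not None:
        ha, hb = image_hash(img_a), image_hash(img_b)
        print(len(ha), "bytes per hash, Hamming distance:", hamming(ha, hb))

Exchanging a few dozen bytes per image instead of the image itself is what keeps the centralized communication channel from saturating while the robots are moving.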

Highlights

  • Simultaneous Localization and Mapping (SLAM) [1] is an essential task for Autonomous Underwater Vehicles (AUV) to accomplish their programmed missions successfully and precisely

  • This paper presents a new approach to multi-robot visual graph SLAM, especially designed for 2.5D configurations, where vehicles move at a constant altitude with a camera pointing downwards, its lens axis kept perpendicular to the ground or to the vehicle's longitudinal axis

  • The indicated trajectory-based scheme implies that the trajectory of each robot is estimated by compounding [47] successive displacements calculated from one point to the next. These displacements form the state of an Extended Kalman Filter (EKF), which is updated using the transforms given by the confirmed loop closings. Each displacement corresponds to the visual odometry computed between consecutive images, and the images candidate to close a loop with the current image are found by comparing the corresponding image hashes and confirmed by a RANSAC-based algorithm applied to a brute-force visual feature-matching process (see the sketch after this list)
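The loop-closing confirmation described in the last highlight can be illustrated with standard OpenCV tools. This is only a minimal sketch, not the authors' implementation: it assumes ORB features, brute-force Hamming matching and a RANSAC homography check, and the threshold MIN_INLIERS is an illustrative value.

    import cv2
    import numpy as np

    MIN_INLIERS = 20  # illustrative acceptance threshold, not from the paper

    def confirm_loop(img_query, img_candidate):
        """Confirm a hash-suggested loop closing with brute-force matching + RANSAC.

        Returns the estimated 3x3 transform if enough geometrically consistent
        matches survive, otherwise None.
        """
        orb = cv2.ORB_create(nfeatures=1000)
        kp1, des1 = orb.detectAndCompute(img_query, None)
        kp2, des2 = orb.detectAndCompute(img_candidate, None)
        if des1 is None or des2 is None:
            return None

        # Brute-force matching of binary descriptors.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des1, des2)
        if len(matches) < MIN_INLIERS:
            return None

        src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

        # RANSAC rejects wrong matches; the loop is accepted only if the
        # remaining inliers support a single consistent transform.
        H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
        if H is None or int(mask.sum()) < MIN_INLIERS:
            return None
        return H

A transform returned by such a check would then act as an additional measurement when updating the EKF state formed by the compounded displacements.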


Summary

Introduction and Related Work

Simultaneous Localization and Mapping (SLAM) [1] is an essential task for Autonomous Underwater Vehicles (AUV) to accomplish their programmed missions successfully and precisely. Running the algorithm online and onboard the vehicles is a must, since the approach is especially addressed to multi-robot configurations, and these configurations imply controlling, mapping and guiding several robots moving simultaneously, where, usually, one of them centralizes the processing of the localization data of the whole group. Although they are not directly novel contributions, it is worth mentioning two additional advantages of the implementation: (i) similarly to [32] or [35], once the maps of different robots are joined, a standard graph-based topological representation is used, where images form nodes and transforms between two images (either between consecutive frames or between two images that close a loop) form edges or links, and (ii) the graph is optimized by means of the standard g2o framework [43]; this standardization facilitates the exchange of the different modules across a variety of software platforms and their reuse among different implementations.
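Since the joined map is a standard pose graph optimized with g2o [43], its structure can be illustrated with the plain-text format that g2o reads. The sketch below is an assumption-laden illustration only: the node identifiers, poses and identity-like information matrix are made-up values, and a real system would fill them from the visual odometry and the confirmed loop closings.

    # Minimal sketch of exporting a 2D pose graph (images as nodes, relative
    # transforms as edges) to the plain-text format consumed by g2o.
    # All numeric values below are made up for illustration.

    odometry = [          # (node_id, x, y, theta) estimated by visual odometry
        (0, 0.0, 0.0, 0.0),
        (1, 1.0, 0.1, 0.05),
        (2, 2.0, 0.3, 0.10),
    ]
    edges = [             # (id_from, id_to, dx, dy, dtheta); last one closes a loop
        (0, 1, 1.0, 0.1, 0.05),
        (1, 2, 1.0, 0.2, 0.05),
        (2, 0, -2.0, -0.3, -0.10),
    ]
    INFO = "100 0 0 100 0 100"   # upper triangle of a 3x3 information matrix

    with open("multi_robot_map.g2o", "w") as f:
        for node_id, x, y, th in odometry:
            f.write(f"VERTEX_SE2 {node_id} {x} {y} {th}\n")
        for i, j, dx, dy, dth in edges:
            f.write(f"EDGE_SE2 {i} {j} {dx} {dy} {dth} {INFO}\n")

    # The resulting file can then be optimized, e.g. with the g2o command-line
    # tool:  g2o -o optimized.g2o multi_robot_map.g2o

Keeping the map in this standard representation is what allows the intra-session, map-joining and inter-session stages to share the same optimization back end.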

Overview
Visual Odometry
Local Loop Detection and Trajectory Optimization
Inter-Session Loop Closings
Map Joining
Multi-Robot Graph SLAM
Variables
Functions
Experimental Setup
Experiments and Results
Some Considerations of the Data Reduction
Sources Availability