Abstract

This paper develops a novel information-fusion SLAM (IF-SLAM) method for mobile robots performing simultaneous localization and mapping (SLAM) in search and rescue (SAR) environments. Several fusion approaches are proposed: parallel filtering of measurements, fusing of exploration trajectories, and combining sensor measurements with robot trajectories. Novel integration particle filter (IPF) and optimal improved extended Kalman filter (IEKF) algorithms are derived for the information-fusion system to perform the SLAM task in SAR scenarios. The information-fusion architecture consists of multiple robots and multiple sensors (MAM); each robot carries on-board laser range finder (LRF) sensors, localization sonars, gyro odometry, a Kinect sensor, an RGB-D camera, and other proprioceptive sensors. Compared with conventional methods, the fused trajectory produced by IF-SLAM is more consistent with both the estimated trajectories and the actually observed trajectories. SLAM simulations and experiments are conducted in a cluttered indoor environment and an unstructured outdoor collapsed scenario, and the results validate the effectiveness of the proposed information-fusion methods in improving SLAM performance in SAR scenarios.
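To make the general idea of fusing parallel sensor measurements concrete, the sketch below shows inverse-variance weighting, a standard way to combine redundant measurements of the same quantity. This is only an illustrative baseline, not the paper's IPF/IEKF algorithms; the function name and values are hypothetical.

```python
import numpy as np

def fuse(measurements, variances):
    """Fuse scalar measurements weighted by the inverse of their variance.

    Returns the fused estimate and its (reduced) variance.
    """
    w = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
    fused = np.sum(w * np.asarray(measurements, dtype=float)) / np.sum(w)
    fused_var = 1.0 / np.sum(w)                    # fused variance never exceeds the best input
    return fused, fused_var

# Two range sensors observing the same landmark distance (metres):
z, var = fuse([2.0, 2.4], [0.04, 0.16])
print(round(z, 2), round(var, 3))  # → 2.08 0.032
```

Note how the fused variance (0.032) is smaller than either sensor's own variance, which is the basic benefit any fusion scheme, including the paper's, seeks to exploit.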

Highlights

  • Mobile robots perform the simultaneous localization and mapping (SLAM) task in postdisaster search and rescue (SAR) scenarios, building a global exploration map while localizing themselves within it

  • Motivated by this research, we developed an information-fusion SLAM (IF-SLAM) of multiple robots and multiple sensors (MAM) that autonomously explores an SAR area without recourse to a global positioning system (GPS)

  • This architecture consists of multiple robots and multiple sensors, including laser range finder (LRF) sensors, localization sonars, gyro odometry, a Kinect sensor, an RGB-D camera, and other proprioceptive sensors mounted on board the robots


Introduction

Mobile robots perform the simultaneous localization and mapping (SLAM) task in postdisaster search and rescue (SAR) scenarios, building a global exploration map while localizing themselves within it. The integration particle filter (IPF) and optimal improved extended Kalman filter (IEKF) algorithms combine traditional consistent Rao-Blackwellized (R-B) particle filters embedded in the MAM IF-SLAM architecture. This architecture consists of multiple robots and multiple sensors, including laser range finder (LRF) sensors, localization sonars, gyro odometry, a Kinect sensor, an RGB-D camera, and other proprioceptive sensors mounted on board the robots. Combining the measurements and trajectories of these multirobot, multisensor platforms improves SLAM performance in object identification, area coverage, and loop closure, which is necessary for robotic search and rescue tasks. The robots explore the postdisaster SAR area and perform the SLAM task while maintaining a graph in which sensor observations are stored in vertices and pose differences, including uncertainty information, are stored in edges.
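The graph described above, observations on vertices and uncertain pose differences on edges, can be sketched as a minimal pose-graph data structure. This is an illustrative sketch only; the class and field names are assumptions, not the paper's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Vertex:
    vertex_id: int
    pose: tuple          # (x, y, theta) estimate at this robot pose
    observation: list    # sensor readings recorded at this pose

@dataclass
class Edge:
    src: int             # vertex id the constraint starts from
    dst: int             # vertex id the constraint points to
    delta: tuple         # relative pose difference (dx, dy, dtheta)
    covariance: list     # 3x3 uncertainty of the measured difference

@dataclass
class PoseGraph:
    vertices: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)

    def add_vertex(self, v: Vertex):
        self.vertices[v.vertex_id] = v

    def add_edge(self, e: Edge):
        # Both endpoints must already exist in the graph.
        assert e.src in self.vertices and e.dst in self.vertices
        self.edges.append(e)

# Usage: two robot poses linked by one odometry constraint.
g = PoseGraph()
g.add_vertex(Vertex(0, (0.0, 0.0, 0.0), [1.2, 3.4]))
g.add_vertex(Vertex(1, (1.0, 0.0, 0.1), [1.1, 3.5]))
g.add_edge(Edge(0, 1, (1.0, 0.0, 0.1),
                [[0.01, 0, 0], [0, 0.01, 0], [0, 0, 0.005]]))
print(len(g.vertices), len(g.edges))  # → 2 1
```

A loop closure in this representation is simply an extra edge between two non-consecutive vertices; optimizing the graph under all edge constraints then yields the corrected trajectory.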

Information Integration Architecture
Information-Fusion Algorithms Adapting to SAR SLAM
SLAM Simulations with Fusion Information
The Experiments for Multi-Information-Fusion SLAM
Results and Discussion
Conclusions and Future Work
