Abstract

The main contribution of this paper is to show the feasibility of using the novel Xtion Pro Live RGBD camera in the field of sensor data fusion and map building based on the well-established Bayesian method. The approach combines the Xtion Pro Live RGBD camera readings with the Hokuyo laser sensor readings, which are interpreted by a probabilistic heuristic model that abstracts each beam as a ray cast towards an occupied grid cell. An occupancy grid is proposed to represent the probabilities of occupied and empty areas. To update the occupancy grid, the Bayesian estimation method is applied to both sensor data arrays. The sensor data fusion yields a significant improvement of the combined occupancy grid over the grids built from each individual sensor's readings. It is also shown, by means of the Mahalanobis distance, that integrating both sensors produces more reliable and accurate maps. The approach is exemplified by following the sensor data fusion method to build a map of an indoor environment with a mobile robot.
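As a rough illustration of the pipeline described above, the sketch below shows a per-cell Bayesian update of an occupancy grid and the subsequent fusion of two sensor-specific grids. The grid size, the inverse sensor probabilities, and the function names are illustrative assumptions for this sketch, not the paper's implementation.

```python
import numpy as np

def bayes_update(prior, p_meas):
    """Per-cell Bayesian update of occupancy probabilities.

    prior  -- current occupancy probability of each grid cell
    p_meas -- occupancy probability assigned to each cell by the
              inverse sensor model for the latest reading
    """
    num = p_meas * prior
    den = num + (1.0 - p_meas) * (1.0 - prior)
    return num / den

# Illustrative 50x50 grids initialised to the unknown state (0.5);
# sizes and probability values are assumptions for this sketch.
laser_grid = np.full((50, 50), 0.5)
rgbd_grid = np.full((50, 50), 0.5)

# Example: a laser beam marks cell (10, 20) as likely occupied,
# while the RGBD camera sees the same cell with lower confidence.
laser_grid[10, 20] = bayes_update(laser_grid[10, 20], 0.9)
rgbd_grid[10, 20] = bayes_update(rgbd_grid[10, 20], 0.7)

# Fuse the two independently built grids cell by cell with the same
# rule to obtain the combined occupancy grid.
fused_grid = bayes_update(laser_grid, rgbd_grid)
print(fused_grid[10, 20])
```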

Highlights

  • Mobile robots’ full autonomy widens their range of applicability

  • Map building, path planning, obstacle avoidance, and control are crucial in achieving full autonomy

  • Map building should be carried out based on the fusion of sensory information; uncertainty is one of the main problems when dealing with sensor data readings


Summary

Introduction

Mobile robots’ full autonomy widens their range of applicability. In this context, it is essential that a mobile robot is able to construct its own map based on sensor data readings, which are the only means the robot has to interact with its surroundings. Map building should be carried out based on the fusion of sensory information, and uncertainty is one of the main problems when dealing with sensor data readings. Previous research on map building based on sensor fusion between a laser range finder and cameras has been carried out. A sensor data fusion approach is followed here by combining the information of a 2D laser scanner with the 3D RGBD image for mapping purposes. The research presented in this paper shows the feasibility of using the novel RGBD camera in the field of sensor data fusion and map building based on the well-known Bayesian method. The previous probability values are inserted into Bayes’ rule to obtain the fused occupancy probability Pfo of the cell Ci,j in the resulting grid.
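A plausible form of this fusion rule, assuming the standard independent-opinion-pool combination of the two sensors' occupancy estimates for cell C_{i,j} (the subscripts "laser" and "rgbd" are labels introduced here for illustration), is:

$$
P_{fo}(C_{i,j}) = \frac{P_{laser}(C_{i,j})\,P_{rgbd}(C_{i,j})}{P_{laser}(C_{i,j})\,P_{rgbd}(C_{i,j}) + \bigl(1 - P_{laser}(C_{i,j})\bigr)\bigl(1 - P_{rgbd}(C_{i,j})\bigr)}
$$

Under this rule, two readings that both favour occupancy reinforce each other (e.g. 0.9 and 0.7 fuse to about 0.95), while a reading of 0.5 leaves the other sensor's estimate unchanged.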

Experiments
Findings
Conclusion and Future Research