Abstract

This paper presents a multisensor integrated vision system and a sensor fusion algorithm for the navigation of an autonomous mobile robot equipped with a laser range finder radar (LRFR) and a color CCD camera to acquire information about the environment. A 2D model of the environment is constructed, and obstacles on the road are detected, by fusing the knowledge contained in the range images obtained by the LRFR and the color images from the camera. The fusion algorithm is based on a generalized Dempster-Shafer theory of evidence (DSTE). Dempster's rule of combination in the DSTE requires that the pieces of evidence being combined be independent of one another; however, as our research shows, it is more reasonable to assume that the information obtained by the vision system is dependent, and under this assumption the data fusion results are more reliable in many cases. Achieving this requires generalizing Dempster's rule of combination to dependent evidence, which forms the other major subject of this paper. The presented system and algorithm have been tested in real environments, and their effectiveness has been demonstrated.
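For context, the classical Dempster's rule of combination that the paper generalizes can be sketched as follows. This is a minimal illustration of the standard independent-evidence rule only, not the paper's dependent-evidence generalization; the obstacle/free-space frame of discernment and the mass values are hypothetical examples, not data from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (BPAs), given as dicts
    mapping frozenset focal elements to mass, using Dempster's classical
    rule. The rule assumes the two evidence sources are independent."""
    combined = {}
    conflict = 0.0  # mass K assigned to contradictory (empty) intersections
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    # Normalize the surviving mass by 1 - K
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical example: the range sensor and the camera each assign mass
# over the frame {O (obstacle), F (free)}; frozenset("OF") is the
# "unknown" hypothesis covering both.
m_range = {frozenset("O"): 0.6, frozenset("OF"): 0.4}
m_color = {frozenset("O"): 0.5, frozenset("F"): 0.2, frozenset("OF"): 0.3}
fused = dempster_combine(m_range, m_color)
```

In this example the two sources reinforce the obstacle hypothesis: the conflict mass is K = 0.6 x 0.2 = 0.12, and after normalization the fused belief in "obstacle" rises above either source's individual mass. The paper's point is that when the two sensors observe the same scene their evidence is not truly independent, so the product form used above must be generalized.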
