Abstract

Dynamic world modeling requires the integration of multiple sensor observations obtained from multiple vehicle locations at different times. A crucial problem in this interpretation task is the presence of uncertainty in the origins of measurements (data association or correspondence uncertainty) as well as in the values of measurements (noise uncertainty). Most previous work in robotics has not distinguished between these two very different forms of uncertainty. In this paper we propose to model the uncertainty due to noise, e.g. the error in an object's position, by conventional covariance matrices. To represent the data association uncertainty, a hypothesis tree is constructed, the branches at any node representing different possible assignments of measurements to features. A rigorous Bayesian data association framework is then introduced that allows the probability of each hypothesis to be calculated. These probabilities can be used to guide an intelligent pruning strategy. The multiple hypothesis tree allows decisions concerning the assignment of measurements to be postponed. Instead, many different hypotheses are considered. Expected observations are predicted for each hypothesis and compared with actual measurements. Hypotheses whose predictions are supported by measurements increase in probability relative to hypotheses whose predictions are unsupported. By "looking ahead" two or three time steps and examining the probabilities at the leaves of the tree, very accurate assignment decisions can be made. For dynamic world modeling, the approach results in multiple world models at a given time step, each one representing a possible interpretation of all past and current measurements and each having an associated probability. In addition, each geometric feature has an associated covariance that models the uncertainty due to noise.
This framework is independent of the sensing modality, being applicable to most temporal data association problems. It is therefore appropriate for the broad class of vision, acoustic and range sensors currently used on existing mobile robots. Preliminary results using ultrasonic range data demonstrate the feasibility of the approach.
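The hypothesis-tree mechanism described above can be sketched in code. The following is a minimal illustration, not the paper's implementation: it tracks 1-D point features, branches each hypothesis on the three possible origins of a measurement (an existing feature, a new feature, or a false alarm), weights branches by a Gaussian measurement likelihood, normalizes the leaf probabilities, and prunes low-probability leaves. All names, priors, and the crude averaging position update are illustrative assumptions.

```python
import math

# Assumed constants for the sketch (not from the paper).
MEAS_VAR = 0.1   # measurement noise variance
P_NEW = 0.05     # prior weight: measurement starts a new feature
P_FALSE = 0.01   # prior weight: measurement is a false alarm


def gaussian(x, mean, var):
    """Likelihood of measurement x given a feature at `mean`."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)


class Hypothesis:
    """One leaf of the tree: a feature map plus its probability."""
    def __init__(self, features, prob):
        self.features = features  # list of estimated feature positions
        self.prob = prob


def branch(hypotheses, z):
    """Expand every leaf with all possible origins of measurement z."""
    children = []
    for h in hypotheses:
        # Origin 1: z came from an existing feature i.
        for i, f in enumerate(h.features):
            updated = list(h.features)
            updated[i] = (f + z) / 2.0  # crude position update stand-in
            children.append(Hypothesis(updated, h.prob * gaussian(z, f, MEAS_VAR)))
        # Origin 2: z starts a new feature.
        children.append(Hypothesis(h.features + [z], h.prob * P_NEW))
        # Origin 3: z is a false alarm; the map is unchanged.
        children.append(Hypothesis(list(h.features), h.prob * P_FALSE))
    total = sum(c.prob for c in children)
    for c in children:
        c.prob /= total  # normalize into posterior probabilities
    return children


def prune(hypotheses, keep=10):
    """Keep only the most probable leaves, renormalizing."""
    best = sorted(hypotheses, key=lambda h: h.prob, reverse=True)[:keep]
    total = sum(h.prob for h in best)
    for h in best:
        h.prob /= total
    return best


# Two time steps of "looking ahead": measurements near 1.0 reinforce the
# hypothesis that a single feature sits near 1.0.
leaves = [Hypothesis([1.0], 1.0)]
for z in (1.02, 0.98):
    leaves = prune(branch(leaves, z))
best = max(leaves, key=lambda h: h.prob)
```

Deferring the assignment decision is what the tree buys: after both measurements, the single-feature interpretation dominates the leaves, while leaves positing spurious new features or false alarms carry little probability and are pruned.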

