Abstract

Egomotion estimation, e.g. for robot navigation or augmented reality applications, requires the fusion of a non-linear sampled-data system with different sensors. An example is fusing the complementary characteristics of visual and inertial sensors. Existing approaches either use Kalman filters in conventionally sampled systems or use particle filters to accommodate the uncertainty of motion models. This paper introduces an approach that models multi-rate non-linear systems to exploit the characteristics of both sensors, assuming synchronicity and periodicity of measurements. The final contribution of this paper is an in-depth analysis and performance comparison of the Extended Kalman filter, the Unscented Kalman filter and three particle filters (Bootstrap, Extended and Unscented). While there is considerable debate over the pros and cons of these two approaches, this work shows the following results for fusing visual and inertial data in 6 DOF (position and orientation) in a tracking application: the Bootstrap particle filter gives a higher estimation error than the Extended and Unscented particle filters, which in turn give results very similar to those of the Extended and Unscented Kalman filters, but at a considerably higher computational cost.
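To make the multi-rate fusion idea concrete, the following is a minimal illustrative sketch (not the paper's implementation) of one predict/update cycle of a bootstrap particle filter: a high-rate inertial measurement drives the prediction step, while a lower-rate visual position fix, when available, drives the update step. Orientation is omitted for brevity, and all function names, parameters, and noise values are hypothetical.

```python
import numpy as np

def bootstrap_pf_step(particles, weights, imu_accel, dt,
                      vision_pos=None, pos_noise=0.05, accel_noise=0.1):
    """One predict/update cycle of a bootstrap particle filter.

    particles:  (N, 6) array of [x, y, z, vx, vy, vz] hypotheses.
    imu_accel:  (3,) inertial acceleration driving the motion model.
    vision_pos: (3,) position fix from the visual sensor, or None on
                IMU-only steps (the multi-rate case: vision arrives
                less often than inertial data).
    """
    n = len(particles)
    # Predict: propagate every particle through the noisy motion model.
    noise = np.random.normal(0.0, accel_noise, size=(n, 3))
    particles[:, :3] += particles[:, 3:] * dt
    particles[:, 3:] += (imu_accel + noise) * dt

    if vision_pos is not None:
        # Update: reweight particles by the Gaussian likelihood of the
        # visual position fix.
        err = particles[:, :3] - vision_pos
        weights = weights * np.exp(-0.5 * np.sum(err**2, axis=1) / pos_noise**2)
        weights /= weights.sum()
        # Resample when the effective sample size collapses.
        if 1.0 / np.sum(weights**2) < n / 2:
            idx = np.random.choice(n, size=n, p=weights)
            particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights
```

The update branch runs only when a camera frame is present, which reflects the asynchrony between sensor rates that the abstract's multi-rate model addresses; the prediction step runs at the full inertial rate.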
