Abstract

In this article, we propose a visual-inertial navigation system that directly minimizes a photometric error without explicit data association. We focus on the photometric error, parametrized by pose and structure parameters, which is highly nonconvex due to the nonlinearity of image intensity. The key idea is to introduce an "optimal intensity gradient" that accounts for the projective uncertainty of a pixel. Ensembles sampled from the state uncertainty contribute to the proposed gradient and yield a correct update direction even from a poor initialization point. We present two sets of experiments to demonstrate the strengths of our framework. First, a thorough Monte Carlo simulation on a virtual trajectory is designed to reveal robustness to large initial uncertainty. Second, we show that the proposed framework achieves superior estimation accuracy with efficient computation time compared with state-of-the-art visual-inertial fusion methods in a real-world UAV flight test, where most scenes consist of a featureless floor.
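To make the key idea concrete, the following is a minimal sketch of an ensemble-averaged intensity gradient, not the paper's exact formulation: it averages the image gradient over pixel locations induced by states sampled from the estimated uncertainty, so that the update direction reflects the projective uncertainty rather than the gradient at the mean projection alone. The `project` callable, the precomputed gradient images, and the Gaussian state model are assumptions introduced purely for illustration.

```python
import numpy as np

def ensemble_intensity_gradient(grad_x, grad_y, project, state_mean, state_cov,
                                landmark, n_samples=32, rng=None):
    """Illustrative ensemble-averaged image gradient at a landmark's projection.

    grad_x, grad_y : 2-D arrays with the horizontal/vertical image gradients.
    project        : assumed callable mapping (state, landmark) -> (u, v) pixel coords.
    state_mean/cov : Gaussian approximation of the current state uncertainty (assumption).
    """
    rng = np.random.default_rng() if rng is None else rng
    samples = []
    for _ in range(n_samples):
        # Sample a state hypothesis from the current uncertainty and project the landmark.
        state = rng.multivariate_normal(state_mean, state_cov)
        u, v = project(state, landmark)
        ui, vi = int(round(u)), int(round(v))
        # Keep only samples whose projection lands inside the image.
        if 0 <= vi < grad_x.shape[0] and 0 <= ui < grad_x.shape[1]:
            samples.append([grad_x[vi, ui], grad_y[vi, ui]])
    if not samples:
        return np.zeros(2)
    # Averaging over the ensemble smooths the highly nonconvex photometric landscape,
    # giving a usable update direction even from a poor initialization point.
    return np.mean(samples, axis=0)
```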
