Abstract

Accurate motion estimation plays a crucial role in the state estimation of an unmanned aerial vehicle (UAV). It is usually carried out by fusing the kinematics of an inertial measurement unit (IMU) with the video output of a camera. However, the accuracy of existing approaches is limited by the discretization effect of the model, even at a high IMU sampling rate. To improve the accuracy, we propose a new motion integration model that treats the IMU kinematics in continuous time. The kinematics are modeled as a switched linear system, and a closed-form discrete formulation is derived to compute the mean measurement, the covariance matrix, and the Jacobian matrix. The model is therefore both more accurate and more efficient for online visual-inertial odometry (VIO) estimation, particularly when the agent's motion changes rapidly or the agent travels at high speed. The proposed IMU factor framework is evaluated on public real-world datasets and in an indoor motion-capture environment under different motion scenarios. Our evaluation shows that the proposed framework outperforms the state-of-the-art VIO approach, achieving up to a 22.71% accuracy improvement on the EuRoC dataset and a 38.15% accuracy improvement for motion estimation in the indoor environment.
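The discretization effect mentioned above can be illustrated with a minimal sketch (not the paper's method): when a time-varying acceleration is held constant over each IMU sampling interval (zero-order hold) and integrated with discrete Euler steps, the velocity estimate accumulates an error that shrinks only as the sampling rate increases. The function and signal below are hypothetical choices for illustration.

```python
import numpy as np

def integrate_velocity_zoh(omega, dt, T):
    """Euler integration of v'(t) = sin(omega * t) with the
    acceleration sampled at step boundaries and held constant
    over each interval (zero-order hold), as in discrete IMU models."""
    n = int(round(T / dt))
    v = 0.0
    for k in range(n):
        a = np.sin(omega * k * dt)  # sampled acceleration, held for dt
        v += a * dt
    return v

omega, T = 2 * np.pi, 0.5          # integrate over half a period
true_v = (1 - np.cos(omega * T)) / omega  # closed-form velocity = 1/pi

coarse = integrate_velocity_zoh(omega, 0.05, T)    # 20 Hz sampling
fine = integrate_velocity_zoh(omega, 0.001, T)     # 1 kHz sampling
print(abs(coarse - true_v), abs(fine - true_v))    # fine error is far smaller
```

A continuous-time model such as the switched linear system proposed in the paper avoids this error source analytically instead of relying on ever-faster sampling.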
