To address problems in characterizing the integrated navigation error of unmanned aerial vehicles (UAVs), this paper proposes a method for measuring the error of visual-inertial odometry based on scene-matching corrections. Firstly, separate models were constructed for visual navigation, Micro-Electro-Mechanical System (MEMS) inertial navigation, and scene-matching correction. Secondly, an integrated navigation error measurement model combining scene-matching corrections with MEMS navigation was established (the MEMS+SM model). Finally, an integrated navigation error measurement model combining scene-matching corrections, visual navigation, and MEMS navigation was constructed (the VN+MEMS+SM model). In the experiments, the average errors of the VN+MEMS+SM and MEMS+SM models were first calculated under different scene-matching accuracies, scene-matching times, and MEMS accuracies. The results indicate that when the scene-matching accuracy is less than 10 m and the scene-matching time is less than 10 s, the errors of the two models are approximately equal. Furthermore, the relationship between scene-matching time and scene-matching accuracy in the MEMS+SM model was calculated. The results show that when the scene-matching time is 10 s, the critical image-matching accuracies required to achieve average errors of 10 m, 30 m, and 50 m are approximately 160 m, 240 m, and 310 m, respectively. Additionally, when the MEMS accuracy is 150, the scene-matching accuracy is 50 m, and the scene-matching time exceeds 135 s, the average error of the VN+MEMS+SM model becomes smaller than that of the MEMS+SM model.
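The abstract does not specify the underlying error models, but the drift-plus-periodic-fix behavior it describes can be illustrated with a minimal Monte Carlo sketch. Everything below is a hypothetical stand-in, not the paper's method: the one-dimensional random-walk drift, the `drift_rate` value, and the `vo_factor` used to approximate how visual odometry damps dead-reckoning drift between scene-matching fixes are all assumptions chosen for illustration.

```python
import numpy as np

def simulate_average_error(
    duration_s=600.0,   # total flight time in seconds (assumed)
    dt=1.0,             # simulation step, seconds
    drift_rate=1.5,     # dead-reckoning drift scale, m per step (hypothetical)
    sm_accuracy=50.0,   # scene-matching fix accuracy, m (1-sigma)
    sm_interval=10.0,   # seconds between scene-matching fixes
    vo_factor=1.0,      # 1.0 = MEMS only; <1 models visual odometry damping drift
    n_runs=500,         # Monte Carlo trials
    seed=0,
):
    """Average position error for a drift-plus-periodic-correction model (sketch)."""
    rng = np.random.default_rng(seed)
    steps = int(duration_s / dt)
    fix_every = max(1, int(sm_interval / dt))
    errors = np.zeros((n_runs, steps))
    for r in range(n_runs):
        err = 0.0
        for k in range(steps):
            # Dead reckoning between fixes: error accumulates as a random walk.
            err += vo_factor * drift_rate * dt * rng.standard_normal()
            if (k + 1) % fix_every == 0:
                # Scene-matching correction: error resets to the fix accuracy.
                err = sm_accuracy * rng.standard_normal()
            errors[r, k] = abs(err)
    return errors.mean()

# MEMS+SM vs. VN+MEMS+SM under the same fix schedule (illustrative only).
mems_sm = simulate_average_error(vo_factor=1.0)
vn_mems_sm = simulate_average_error(vo_factor=0.3)  # VO assumed to damp drift
print(f"MEMS+SM avg error:    {mems_sm:.1f} m")
print(f"VN+MEMS+SM avg error: {vn_mems_sm:.1f} m")
```

Under these assumptions the sketch reproduces the qualitative trend reported above: with frequent, accurate scene-matching fixes the correction dominates the error budget and the two models behave almost identically, whereas long fix intervals or coarse fixes let drift accumulate, which is where damping it with visual odometry pays off.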