Abstract

In a GPS-denied environment, even the combination of GPS and an Inertial Navigation System (INS) cannot provide reliable and accurate localization. We propose a new denoised stereo Visual Odometry (VO)/INS/GPS integration system for autonomous navigation based on tightly coupled fusion. The presented navigation system can estimate the vehicle's location in either GPS-denied or low-texture environments. Because of the random-walk characteristics of the drift error of inertial measurement units (IMUs), the state errors grow with time. Correcting these growing errors requires a continuous stream of observation updates. For this purpose, the system state vector is augmented with features extracted from a stereo camera, and we use the measurements of those features across consecutive frames, together with GPS-derived information, to perform the updates. Moreover, we apply the discrete wavelet transform (DWT) before data fusion to improve the signal-to-noise ratio (SNR) of the inertial sensor measurements, attenuating high-frequency noise while preserving significant information such as the vehicle motion. To verify the performance of the proposed method, we use four flight benchmark datasets, with top speeds of 5 m/s, 10 m/s, 15 m/s, and 17.5 m/s, collected over an airport runway by a quadrotor. The results demonstrate that the proposed VO/INS/GPS navigation system is more accurate and more stable than the VO/INS and GPS/INS methods in either GPS-denied or low-texture environments, outperforming them by approximately 66% and 54%, respectively.
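The DWT pre-filtering step mentioned above can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example, not the authors' implementation: it soft-thresholds the detail coefficients of a single IMU channel using PyWavelets, and the wavelet family, decomposition level, and thresholding rule are assumptions chosen only for illustration.

```python
# Minimal sketch of DWT denoising for a raw IMU channel (illustrative only;
# wavelet, level, and threshold rule are assumptions, not the paper's settings).
import numpy as np
import pywt

def dwt_denoise(signal, wavelet="db4", level=3):
    """Soft-threshold the detail coefficients of a 1-D IMU signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Estimate the noise level from the finest detail band (universal threshold).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    # Keep the approximation band (low-frequency motion); attenuate the
    # high-frequency detail bands that mostly carry sensor noise.
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]

# Example: denoise one gyroscope axis before feeding it to the fusion filter.
gyro_x = np.random.randn(2000) * 0.01   # stand-in for raw gyro samples
gyro_x_clean = dwt_denoise(gyro_x)
```

In this kind of pipeline, each accelerometer and gyroscope axis would typically be filtered independently before the denoised measurements are propagated through the INS mechanization and fused with the VO and GPS observations.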
