Abstract

Unmanned aerial vehicles (UAVs) have become an alternative to traditional mapping platforms for many applications because of their flexibility and lower cost. Most UAVs used as mobile mapping systems (MMS) rely on a position and orientation system (POS) onboard the platform. A POS typically consists of a Global Navigation Satellite System (GNSS) receiver as the positioning sensor integrated with an Inertial Navigation System (INS), whose Inertial Measurement Unit (IMU) serves as the orientation sensor. GNSS/INS integration is usually performed in either a loosely coupled (LC) or a tightly coupled (TC) scheme. While LC is a simple scheme that uses GNSS solutions to aid INS navigation, TC can integrate raw GNSS measurements with the INS even when fewer than four GNSS satellites are tracked, which makes it the better candidate for most MMS applications. However, in environments with long GNSS outages, especially where no satellites are tracked, the TC architecture is still not sufficient, even when smoothers are applied afterward. In this research, a low-cost UAV MMS is developed that combines an inexpensive POS, a camera, and a spinning multi-beam LiDAR for different mapping applications. In addition, a processing strategy is proposed to refine the trajectory during GNSS outages. The strategy applies a two-stage Kalman filter (KF) together with smoothers. The first stage is a TC KF, followed by a smoother, that takes GNSS and INS measurements as inputs. The second stage is an LC KF that uses the output of the first stage and a vision-based trajectory to refine the trajectory during GNSS outages. The vision-based trajectory is derived through integrated sensor orientation (ISO), i.e., a bundle adjustment (BA) of a block of images collected over the area of interest, without any ground control points. The proposed technique improves the accuracy of the estimated position during GNSS outages by 50% in the planimetric direction and 25% in the vertical direction, and improves the estimated heading accuracy of the platform by 25%. The refined vision-aided GNSS/INS trajectory is then used to derive more accurate georeferenced information from the other sensors onboard the mapping platform, such as the LiDAR. Finally, quantitative and qualitative quality control are conducted to evaluate the vision-aided GNSS/INS trajectory derived from the proposed processing strategy.
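To illustrate the second-stage loosely coupled update, the following minimal Python sketch shows how a position fix taken from the vision-based (ISO/BA) trajectory could correct the INS-predicted state during a GNSS outage. It is not the authors' implementation: the six-state position/velocity filter, the function name lc_update, and all numerical values are assumptions for illustration only; the filter described in the paper would also carry attitude and sensor-error states.

import numpy as np

def lc_update(x_pred, P_pred, z_vision, R_vision):
    # One loosely coupled Kalman filter measurement update (illustrative sketch).
    # x_pred   : predicted state [x, y, z, vx, vy, vz] from INS mechanization (assumed 6-state filter)
    # P_pred   : 6x6 predicted state covariance
    # z_vision : 3-D position taken from the vision-based (ISO/BA) trajectory
    # R_vision : 3x3 measurement noise covariance assigned to the vision fix
    H = np.hstack([np.eye(3), np.zeros((3, 3))])   # only position is observed
    y = z_vision - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R_vision                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_upd = x_pred + K @ y                         # corrected state
    P_upd = (np.eye(6) - K @ H) @ P_pred           # corrected covariance
    return x_upd, P_upd

# Illustrative numbers only: an INS-predicted position that has drifted is
# pulled toward the vision-based fix, weighted by the relative uncertainties.
x_pred = np.array([100.0, 50.0, 30.0, 5.0, 0.0, 0.0])
P_pred = np.diag([4.0, 4.0, 9.0, 0.1, 0.1, 0.1])
z_vision = np.array([98.0, 50.5, 29.0])
R_vision = np.diag([1.0, 1.0, 2.0])
x_upd, P_upd = lc_update(x_pred, P_pred, z_vision, R_vision)
print(x_upd[:3])   # updated position lies between the INS prediction and the vision fix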
