Abstract

Unmanned Aerial Vehicles (UAVs) require an accurate estimate of their state. For state estimation and localization relative to a scene, computer vision offers a number of benefits over conventionally used sensors such as the Global Positioning System (GPS) or a Motion Capture System (MCS). Our work uses the output of an existing Visual Simultaneous Localization and Mapping (VSLAM) system, which provides a scaled position measurement. We propose an observer design to estimate vehicle position and linear velocity. The observer fuses accelerometer measurements from an Inertial Measurement Unit (IMU) with the VSLAM system output, and relies on an attitude estimate from an Attitude and Heading Reference System (AHRS). A change of coordinates is used to transform the system into a Linear Time-Varying (LTV) form. In these coordinates we consider the observability of the Visual Inertial Simultaneous Localization and Mapping (VISLAM) problem. Two observer designs are proposed and their performance is validated in simulation and experiment.
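
To make the scaled-position fusion concrete, below is a minimal sketch of one possible discrete-time observer in this spirit. It assumes a hypothetical change of coordinates z1 = λp, z2 = λv, with the unknown VSLAM scale λ appended as a constant state, which renders the dynamics linear time-varying in the inertial-frame acceleration obtained from the IMU and the AHRS attitude; a standard Kalman filter then plays the role of the observer. The coordinate choice, state ordering, and function names (f_discrete, kf_step, metric_state) are illustrative assumptions, not the authors' exact design.

```python
# Minimal sketch (assumptions): discrete-time Kalman filter for scaled-position
# VISLAM fusion with state x = [z1 (3), z2 (3), lambda], where z1 = lambda*p,
# z2 = lambda*v.  The input a is the specific force already rotated to the
# inertial frame with the AHRS attitude and gravity-compensated.  The dynamics
# are linear time-varying because the transition matrix depends on a(t).
import numpy as np

def f_discrete(dt, a):
    """One-step transition matrix for x = [z1, z2, lambda]."""
    F = np.eye(7)
    F[0:3, 3:6] = dt * np.eye(3)   # z1 <- z1 + dt * z2
    F[3:6, 6] = dt * a             # z2 <- z2 + dt * lambda * a  (LTV term)
    return F

def kf_step(x, P, a, y, dt, Q, R):
    """Propagate with the accelerometer, then correct with the scaled
    VSLAM position measurement y = lambda * p = z1."""
    # Prediction
    F = f_discrete(dt, a)
    x = F @ x
    P = F @ P @ F.T + Q
    # Correction: the measurement matrix picks out z1
    H = np.zeros((3, 7))
    H[:, 0:3] = np.eye(3)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (y - H @ x)
    P = (np.eye(7) - K @ H) @ P
    return x, P

def metric_state(x, eps=1e-6):
    """Recover metric position and velocity by dividing out the scale."""
    lam = max(x[6], eps)
    return x[0:3] / lam, x[3:6] / lam

# Example usage with placeholder values (not from the paper):
# x0 = np.zeros(7); x0[6] = 1.0                     # unit initial scale guess
# P0 = np.eye(7); Q = 1e-3 * np.eye(7); R = 1e-2 * np.eye(3)
# x, P = kf_step(x0, P0, a=np.array([0.0, 0.0, 0.1]),
#                y=np.zeros(3), dt=0.01, Q=Q, R=R)
```

In this illustrative formulation the scale λ is only observable when the acceleration input excites the dynamics, which is why the observability analysis in the LTV coordinates matters before the metric position and velocity can be recovered by dividing by the estimated scale.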
