Abstract

Environments in which Global Positioning System (GPS), or more generally Global Navigation Satellite System (GNSS), signals are denied or degraded pose problems for the guidance, navigation, and control of autonomous systems. This can make operating in hostile, GNSS-impaired environments, such as indoors or in urban and natural canyons, extremely difficult or impossible. Pixel Processor Array (PPA) cameras, in conjunction with other on-board sensors, can be used to address this problem, aiding tracking, localization, and control. In this paper we demonstrate the use of a PPA device, the SCAMP vision chip, which combines perception and compute capabilities on the same device, to aid the real-time navigation and control of aerial robots. A PPA consists of an array of Processing Elements (PEs), each of which features light capture, processing, and storage capabilities. This allows various image processing tasks to be performed efficiently on the sensor itself. Within this paper we demonstrate visual odometry and target identification running concurrently on board a single PPA vision chip at a combined frequency in the region of 400 Hz. Results from outdoor multirotor test flights are given along with comparisons against baseline GPS results. The SCAMP PPA's High Dynamic Range (HDR) and its ability to run multiple algorithms at adaptive rates make the sensor well suited to outdoor flight of small UAS in GNSS-challenged or GNSS-denied environments. HDR allows operation to continue during the transition from indoor to outdoor environments, and in other situations where light levels vary significantly. Additionally, the PPA only needs to output specific information, such as the optic flow and target position, rather than entire images. This significantly reduces the bandwidth required for communication between the sensor and the on-board flight computer, enabling high-frame-rate, low-power operation.
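To make the bandwidth claim concrete, the back-of-envelope Python sketch below contrasts streaming full frames with streaming only compact flow-and-target packets. It is illustrative only: the 256 x 256 resolution corresponds to the SCAMP-5 PE array and the 400 Hz rate is quoted above, but the 16-byte packet layout is an assumption introduced here.

    # Back-of-envelope comparison: streaming full frames vs. compact packets.
    # Resolution matches the 256x256 SCAMP-5 PE array; the 16-byte packet
    # (2D flow vector + 2D target position as 32-bit ints) is an assumption.
    RATE_HZ = 400                      # combined update rate quoted above

    frame_bytes = 256 * 256            # one 8-bit grayscale image
    packet_bytes = 4 * 4               # four 32-bit integers

    print(f"raw frames: {frame_bytes * RATE_HZ / 1e6:.1f} MB/s")   # ~26.2 MB/s
    print(f"packets:    {packet_bytes * RATE_HZ / 1e3:.1f} kB/s")  # ~6.4 kB/s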

Highlights

  • To achieve successful autonomous operation of an unmanned aerial system (UAS), it is necessary for the vehicle to maintain an acceptable estimate of its position

  • This nonlinear scaling function was determined by manually scaling data that was not used in the analysis to match the Global Positioning System (GPS) track at different altitudes; the scaling factors at these altitudes were used to develop a scaling function that varies with height (see the sketch after this list)

  • A scaling factor was applied to the PX4FLOW data, determined in the same way as each individual scaling factor for the SCAMP; this was required because the operating altitude of 5 m Above Ground Level (AGL) was outside the PX4FLOW sensor's range capability
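The height-dependent scaling described in the highlights above can be sketched as a simple curve fit. This is a minimal, illustrative sketch: the per-altitude scale factors are hypothetical placeholders, and the quadratic form is an assumption, since the highlights state only that the function is nonlinear.

    import numpy as np

    # Hypothetical (altitude, scale factor) pairs, standing in for values
    # obtained by manually matching SCAMP tracks to the GPS ground truth.
    altitudes_m   = np.array([2.0, 5.0, 10.0, 15.0])
    scale_factors = np.array([0.9, 2.1, 4.0, 6.2])

    # Fit a smooth function of height AGL; the quadratic form is assumed.
    scale_fn = np.poly1d(np.polyfit(altitudes_m, scale_factors, deg=2))

    def scale_odometry(raw_displacement, height_agl_m):
        """Apply the height-dependent scale to raw on-sensor odometry."""
        return raw_displacement * scale_fn(height_agl_m)

    print(scale_fn(5.0))   # scale factor at the 5 m AGL operating altitude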

Introduction

To achieve successful autonomous operation of a UAS, it is necessary for the vehicle to maintain an acceptable estimate of its position. Sensors commonly found on board include a GPS receiver, an Inertial Measurement Unit (IMU), and a camera. In many circumstances this is sufficient, as the GPS localizes position while the IMU determines orientation, angular velocities, and linear accelerations. However, there are environments where GNSS signals may be degraded, such as urban or natural canyons. In these situations additional on-board sensors must be used to navigate and maintain knowledge of the current position. These can be based on a wide range of technologies, such as radar and LIDAR; one key strategy is the use of cameras for Visual Odometry (VO).
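As a concrete illustration of camera-based VO, the minimal sketch below converts image-plane optic flow into ground-plane velocity with a pinhole model for a downward-facing camera, then dead-reckons position by integration. The focal length and sample flow values are assumptions for illustration (the 400 Hz rate is quoted in the abstract), and the sketch omits tilt compensation and the PPA-specific on-sensor processing described in the paper.

    import numpy as np

    FOCAL_PX = 300.0                   # assumed focal length, in pixels

    def flow_to_velocity(flow_px_per_s, height_agl_m):
        """Pinhole model: ground-plane velocity from image-plane optic flow
        for a downward-facing camera (tilt compensation omitted)."""
        return np.asarray(flow_px_per_s, dtype=float) * height_agl_m / FOCAL_PX

    # Dead-reckon 2D position by integrating flow-derived velocity.
    pos = np.zeros(2)
    dt = 1.0 / 400.0                   # update rate quoted in the abstract
    for flow in [(12.0, -3.0), (11.5, -2.8), (12.2, -3.1)]:   # fake samples
        pos += flow_to_velocity(flow, height_agl_m=5.0) * dt
    print(pos)                         # accumulated displacement in metres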
