Abstract

Flight testbeds with multiple unmanned aerial vehicles (UAVs) are essential for supporting research on multi-vehicle algorithms, yet existing platforms usually lack a generic, complete solution covering both software and hardware design. To this end, this paper presents the design and implementation of a comprehensive multi-camera-based testbed for 3-D tracking and control of UAVs. First, the testbed software consists of a multi-camera system and a ground control system, which together perform image processing, camera calibration, 3-D reconstruction, pose estimation, and motion control. In the multi-camera system, the positions and orientations of the UAVs are first reconstructed using epipolar geometric constraints and triangulation and then filtered by an extended Kalman filter (EKF). In the ground control system, a classical proportional–derivative (PD) controller is designed to receive the navigation data from the multi-camera system and then generate control commands for the target vehicles. Then, the testbed hardware employs smart cameras with field-programmable gate array (FPGA) modules to allow for real-time computation at a frame rate of 100 Hz. Lightweight Parrot Bebop quadcopters are chosen as the target UAVs and require no hardware modification. Artificial infrared reflective markers are asymmetrically mounted on the target vehicles and observed by multiple infrared cameras located around the flight region. Finally, extensive experiments demonstrate that the proposed testbed is a comprehensive and complete platform with good scalability, suitable for research on a variety of advanced guidance, navigation, and control algorithms.
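The control loop described above (EKF-filtered position estimates fed to a classical PD law at the 100 Hz camera frame rate) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the gain values, function names, and the choice of a velocity-style command output are all assumptions for the sake of the example.

```python
import numpy as np

# Hypothetical PD gains per axis (x, y, z); the paper does not report values.
KP = np.array([1.2, 1.2, 1.5])
KD = np.array([0.4, 0.4, 0.6])
DT = 1.0 / 100.0  # update interval matching the 100 Hz camera frame rate

def pd_command(pos, vel, pos_ref):
    """Compute a per-axis PD control command toward a reference position.

    pos, vel : EKF-filtered position and velocity estimates from the
               multi-camera system (3-vectors, in meters and m/s).
    pos_ref  : desired position (3-vector).
    """
    err = pos_ref - pos           # position error
    return KP * err - KD * vel    # PD law: damp motion with estimated velocity
```

At each camera frame, the ground control system would call `pd_command` with the latest filtered state and send the resulting command to the vehicle; the derivative term uses the EKF velocity estimate rather than a numerically differentiated position, which is a common choice for noise-sensitive vision-based tracking.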
