Abstract

This article addresses the development and implementation of a test bed for applications of heterogeneous unmanned vehicle systems. The test bed consists of unmanned aerial vehicles (Parrot AR.Drone versions 1 and 2, Parrot SA, Paris, France, and Bebop Drones 1.0 and 2.0, Parrot SA, Paris, France), ground vehicles (WowWee Rovio, WowWee Group Limited, Hong Kong, China), and the VICON and OptiTrack motion capture systems. The test bed lets the user choose between two development environments for building aerial and ground vehicle applications. On the one hand, it is possible to select an environment based on the VICON system and LabVIEW (National Instruments) or Robot Operating System (ROS) platforms, which make use of the Parrot AR.Drone software development kit or the Bebop_autonomy driver to communicate with the unmanned vehicles. On the other hand, it is possible to employ a platform that uses the OptiTrack system and allows users to develop their own applications, replacing the AR.Drone's factory firmware with their own code. We have developed four experimental setups to illustrate the use of the Parrot software development kit, the Bebop driver (AutonomyLab, Simon Fraser University, British Columbia, Canada), and the firmware replacement for performing a strategy that involves both ground and aerial vehicle tracking. Finally, in order to illustrate the effectiveness of the developed test bed for the implementation of advanced controllers, we present experimental results for three consensus algorithms (static, adaptive, and neural network) that make a team of agents move together to track a target.
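
As a rough illustration of the kind of controller the test bed is meant to host, the sketch below implements a basic static consensus tracking law for single-integrator agents following a target. The communication graph, gains, and agent model are assumptions chosen for illustration; they are not the controllers evaluated in the article.

# Minimal sketch (not the article's controllers): static consensus tracking
# for single-integrator agents, u_i = -sum_j a_ij (x_i - x_j) - b_i (x_i - x0).
import numpy as np

def consensus_step(x, x_target, A, b, dt=0.02, k=1.0):
    """One Euler step of a static consensus tracking law.

    x        : (n, 2) agent positions
    x_target : (2,)   target position (e.g., from VICON or OptiTrack)
    A        : (n, n) adjacency matrix of the communication graph
    b        : (n,)   pinning gains (1 if the agent measures the target)
    """
    n = x.shape[0]
    u = np.zeros_like(x)
    for i in range(n):
        # Disagreement with neighbours plus attraction toward the target.
        u[i] = -sum(A[i, j] * (x[i] - x[j]) for j in range(n)) \
               - b[i] * (x[i] - x_target)
    return x + dt * k * u

# Example: two followers, fully connected, both pinned to the target.
x = np.array([[0.5, 0.0], [-0.5, 0.0]])
A = np.array([[0.0, 1.0], [1.0, 0.0]])
b = np.array([1.0, 1.0])
for _ in range(500):
    x = consensus_step(x, np.array([1.0, 1.0]), A, b)
print(x)  # both agents end up close to the target position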

Highlights

  • Unmanned aircraft systems (UASs) require advanced features, in aerodynamic design and avionics systems, for performing different and complex tasks in places considered too dangerous for human beings

  • In order to validate the effectiveness of the developed test bed, a set of experiments was performed for both the Parrot AR.Drones and the Bebop Drones

  • The target vehicle (Rovio or UAV0) has to reach four waypoints arranged in a square of 1 × 1 m², while the drones UAV1 and UAV2 track the target vehicle’s position and orientation at 0° and 180°, respectively
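
The tracking geometry in the last highlight can be made concrete with a short helper that turns the target's pose into the two desired drone positions at 0° and 180° around it. The 0.8 m standoff radius and the example pose are illustrative assumptions, not parameters taken from the article.

# Sketch of the 0°/180° tracking geometry; the radius and pose values are
# illustrative assumptions only.
import math

def desired_positions(target_x, target_y, target_yaw, radius=0.8):
    # Set-point for UAV1, placed at 0° (ahead of the target along its heading).
    uav1 = (target_x + radius * math.cos(target_yaw),
            target_y + radius * math.sin(target_yaw))
    # Set-point for UAV2, placed at 180° (behind the target).
    uav2 = (target_x + radius * math.cos(target_yaw + math.pi),
            target_y + radius * math.sin(target_yaw + math.pi))
    return uav1, uav2

# Target at a corner of the 1 × 1 m waypoint square, heading along +x.
print(desired_positions(1.0, 1.0, 0.0))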

Summary

Introduction

Unmanned aircraft systems (UASs) require advanced features, in aerodynamic design and avionics systems, for performing different and complex tasks in places considered too dangerous for human beings.

AR.Drone control loop (LabVIEW):
Require: Install the AR.Drone Toolkit LVH
1: Initialize the PPM decoder and AR.Drone communications with the Open and VISA VIs
2: while Stop Button == false do
3:   Obtain the desired roll and pitch angles and the yaw and vertical speeds from the PPM decoder, from the GUI buttons, or from the user control law VI
4:   Send the control commands to the drone by using the Control drone VI
5:   Read and display the navigation data with the Read NavData VI
6: end while
7: Close the communication ports of the PPM decoder and the AR.Drone with the VISA Close and Close VIs

In order to establish a connection between LabVIEW and the VICON system, it is necessary to download and install the VICON DataStreamSDK, provided by the VICON Company (Oxford, England, United Kingdom).

LabVIEW–VICON connection:
Require: Install the .NET 4.0 Framework
Require: Install Vicon DataStreamSDK.msi
1: Use a Constructor Node as a client using the DS-SDK
2: …
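
The LabVIEW listing above is built from graphical VIs, so it cannot be quoted as text; the sketch below only mirrors its loop structure in Python. The helper functions are hypothetical stand-ins for the toolkit's VIs (they simulate I/O here) and are not AR.Drone SDK calls.

# Python sketch mirroring the structure of the LabVIEW control loop above.
# open_ports, read_setpoints, send_control and read_navdata are hypothetical
# stand-ins for the toolkit's VIs, not real SDK functions.
import time

def open_ports():
    # Step 1 stand-in: open the PPM decoder (serial/VISA) and drone sockets.
    return {"ppm": "COM3", "drone": "192.168.1.1"}

def read_setpoints(ports):
    # Step 3 stand-in: desired roll/pitch angles and yaw/vertical speeds,
    # taken from the PPM decoder, the GUI, or a user control law.
    return 0.0, 0.0, 0.0, 0.0

def send_control(ports, roll, pitch, yaw_rate, vz):
    # Step 4 stand-in: send the control command to the drone.
    pass

def read_navdata(ports):
    # Step 5 stand-in: read the navigation data stream.
    return {"battery": 100, "altitude": 0.0}

def control_loop(duration_s=1.0, period_s=0.03):
    ports = open_ports()                                     # step 1
    t_end = time.time() + duration_s
    try:
        while time.time() < t_end:                           # step 2 (stop condition)
            roll, pitch, yaw_rate, vz = read_setpoints(ports)  # step 3
            send_control(ports, roll, pitch, yaw_rate, vz)     # step 4
            print(read_navdata(ports))                         # step 5
            time.sleep(period_s)
    finally:
        ports.clear()                                        # step 7: close ports

if __name__ == "__main__":
    control_loop()
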
Experimental results
Conclusion