Abstract

Directing groups of unmanned air vehicles (UAVs) is a task that typically requires the full attention of several operators. This can be prohibitive in situations where an operator must also pay attention to their surroundings. In this paper we present a gesture device that assists operators in commanding UAVs in focus-constrained environments. The operator influences the UAVs’ behavior through intuitive hand gestures. Gestures are captured using an accelerometer and gyroscope and then classified using a logistic regression model (sketched below). Ten gestures were chosen to provide behaviors for a group of fixed-wing UAVs. These behaviors specified various searching, following, and tracking patterns that could be used in a dynamic environment. A novel variant of the Monte Carlo Tree Search (MCTS) algorithm was developed to autonomously plan the paths of the cooperating UAVs. These autonomy algorithms were executed when their corresponding gesture was recognized by the gesture device. The gesture device was trained to classify the ten gestures and accurately identified them 95% of the time. Each of the behaviors associated with the gestures was tested in hardware-in-the-loop simulations, and the ability to dynamically switch between them was demonstrated. The results show that the system can be used as a natural interface to assist an operator in directing a fleet of UAVs.

Article highlights

  • A gesture device was created that enables operators to command a group of UAVs in focus-constrained environments.

  • Each gesture triggers high-level commands that direct a UAV group to execute complex behaviors.

  • Software simulations and hardware-in-the-loop testing show the device is effective in directing UAV groups.
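To make the classification stage described in the abstract concrete, the following is a minimal sketch of gesture recognition from windowed accelerometer and gyroscope samples using logistic regression. The window length, feature set, and scikit-learn usage are illustrative assumptions, not the paper's exact pipeline.

```python
# Minimal sketch: windowed IMU samples are reduced to per-axis statistics
# and classified with logistic regression. Window length and features are
# assumed for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

WINDOW = 50   # samples per gesture window (assumed)
N_AXES = 6    # 3-axis accelerometer + 3-axis gyroscope

def window_to_features(imu_window: np.ndarray) -> np.ndarray:
    """Flatten a (WINDOW, N_AXES) IMU window into per-axis statistics."""
    return np.concatenate([imu_window.mean(axis=0),
                           imu_window.std(axis=0),
                           imu_window.min(axis=0),
                           imu_window.max(axis=0)])

# Stand-in training data: 200 labeled windows covering the ten gestures.
rng = np.random.default_rng(0)
X_train = np.stack([window_to_features(rng.normal(size=(WINDOW, N_AXES)))
                    for _ in range(200)])
y_train = rng.integers(0, 10, size=200)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)

# At runtime, each newly captured window is classified; the predicted label
# selects which high-level UAV behavior to trigger.
new_window = rng.normal(size=(WINDOW, N_AXES))
gesture_id = clf.predict(window_to_features(new_window)[None, :])[0]
print(f"recognized gesture: {gesture_id}")
```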

Highlights

  • While the capabilities and applications for Unmanned Aerial Vehicles (UAVs) have expanded drastically, their effectiveness is often limited by the number of available operators and the complexity of the assigned task.

  • In [6] we introduced a variant of MCTS called Coordinate Monte Carlo Tree Search (CMCTS).

  • During CMCTS, we follow the structure of MCTS by building a search tree $T_j[n] = (N_j[n], E_j[n])$ for each UAV at time step $n$, where $W$ is the total number of UAVs, $j \in [1, W]$, $N_j[n]$ are the tree’s nodes, and $E_j[n]$ its edges (see the sketch after this list).
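To make the tree notation concrete, below is a minimal sketch of one per-UAV search tree $T_j[n]$. The node fields (visit count, accumulated value, children keyed by action) are standard MCTS bookkeeping, assumed here for illustration rather than taken from the paper.

```python
# Illustrative representation of the per-UAV search tree
# T_j[n] = (N_j[n], E_j[n]); fields follow standard MCTS bookkeeping.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Node:
    state: tuple        # UAV state, e.g. (x, y, heading)
    visits: int = 0     # times this node was selected
    value: float = 0.0  # accumulated rollout reward
    children: Dict[int, "Node"] = field(default_factory=dict)  # edges, keyed by action

def backpropagate(path: list, reward: float) -> None:
    """Standard MCTS backpropagation: credit a rollout reward along the visited path."""
    for node in path:
        node.visits += 1
        node.value += reward

# One tree per UAV j in [1, W], maintained at each planning step n.
W = 3
trees = [Node(state=(0.0, 0.0, 0.0)) for _ in range(W)]
```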

Summary

Introduction

While the capabilities and applications of Unmanned Aerial Vehicles (UAVs) have expanded drastically, their effectiveness is often limited by the number of available operators and the complexity of the assigned task. This paper extends our earlier work on gesture-based UAV command [6]. Specific additions to this work include (a) adding four additional gestures and corresponding high-level UAV behaviors that an operator may command, (b) validating the algorithms with hardware-in-the-loop experiments using an in-the-field operator and a virtual UAV, (c) incorporating a non-myopic control methodology that accounts for reward beyond the UAV’s event horizon using artificial potential fields, and (d) showing through simulation experiments that the gesture commands are a viable method for directing groups of UAVs in search-and-track scenarios. These additions will be highlighted in greater detail in their relevant sections throughout the paper.
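As a concrete reading of addition (c), the sketch below shows one way an artificial potential field could credit reward lying beyond the planning horizon: leaves of the search tree receive a bonus that grows as they approach distant reward sources. The quadratic potential, the gain k_att, and the helper names are assumptions for illustration, not the paper's formulation.

```python
# Illustrative artificial-potential-field term for non-myopic planning:
# reward beyond the UAV's event horizon attracts the planner by biasing
# the values of tree leaves. Quadratic well and gain are assumed.
import numpy as np

def attractive_potential(pos: np.ndarray, goal: np.ndarray,
                         k_att: float = 1.0) -> float:
    """Quadratic attractive well: zero at the goal, growing with distance."""
    d = np.linalg.norm(goal - pos)
    return 0.5 * k_att * d**2

def horizon_bonus(leaf_pos: np.ndarray, far_rewards: list,
                  k_att: float = 1e-4) -> float:
    """Bias a tree leaf's value toward reward beyond the event horizon."""
    return sum(-attractive_potential(leaf_pos, np.asarray(r), k_att)
               for r in far_rewards)

leaf = np.array([100.0, 50.0])
targets = [(400.0, 300.0), (250.0, 600.0)]
print(horizon_bonus(leaf, targets))  # bonus is less negative as the leaf nears reward
```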

Background
UAV‐operator interface devices
Cooperative path planning
Human‐UAV cooperation
Path planning
Selection
Expansion
Simulation
Backpropagation
Operator‐UAV gesture interface
Gesture device
Gesture classifier
Gesture commands
Results
Simulation environment
CMCTS testing
Gesture testing
Wide search
Deep search
Return to home
Return and search
Area search
Target tracking
Simulated testing
Outdoor hardware testing
Conclusion