Abstract

This work lays the foundation for a gesture-based interaction system between cargo-handling unmanned aerial vehicles (UAVs) and ground personnel. It enables novice operators to visually communicate high-level commands with a minimal set of gestures. The interaction concept transfers two goal-directed control techniques to the cargo-handling use case: selecting objects via deictic pointing communicates intention, and a single proxy-manipulation gesture controls the UAV’s flight. A visual processing pipeline built around an RGB-D sensor is presented, and its constituent components, such as lightweight object detectors and human pose estimation methods, are benchmarked on the UAV-Human dataset. The results provide an overview of suitable methods for 3D gesture-based human-drone interaction. A first, unoptimized model ensemble runs at 7 Hz on a Jetson AGX Orin Developer Kit.
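The deictic-pointing step described above can be illustrated with a minimal sketch: cast a ray from the operator's shoulder through the wrist (both taken as 3D keypoints from a pose estimator) and select the candidate object whose centroid lies closest to that ray. All function names, keypoint choices, and the angular threshold here are illustrative assumptions, not the method from the paper.

```python
import math

def _unit(v):
    # Normalize a 3D vector given as a tuple of coordinates.
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def select_target(shoulder, wrist, centroids, max_angle_deg=15.0):
    """Hypothetical deictic selection: return the index of the object
    centroid closest in angle to the shoulder->wrist pointing ray,
    or None if no candidate falls within max_angle_deg."""
    direction = _unit(tuple(w - s for w, s in zip(wrist, shoulder)))
    best_idx, best_angle = None, math.radians(max_angle_deg)
    for i, c in enumerate(centroids):
        v = _unit(tuple(ci - s for ci, s in zip(c, shoulder)))
        dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(direction, v))))
        angle = math.acos(dot)
        if angle < best_angle:
            best_idx, best_angle = i, angle
    return best_idx
```

In a full pipeline the centroids would come from the object detector's 3D boxes (via the RGB-D depth map) and the keypoints from the pose estimator; the angular threshold trades selection robustness against ambiguity when objects are close together.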
