Abstract

Unmanned Aerial Vehicles (UAVs) are increasingly integrated into diverse human interaction domains that require robust human-robot communication systems. Visual communication techniques have shown promise in their ability to communicate concrete information to observers. Such techniques, often described as a UAV ‘gesture,’ may be especially useful in the domain of unmanned aerial flight, as they can be integrated as a stand-alone software solution, in contrast to light- or sound-based systems that often require additional hardware and add weight to a vehicle. Gestures may also be useful in contexts where long-distance operation reduces the effectiveness of sound-based communication strategies. Because gesture is a visual communication technique, it is critical that gestures be designed to optimize an observer’s ability to visually perceive the shape of a gesture’s motion. Factors such as low visual differentiability between gestures within a set may reduce an observer’s ability to classify the shape of a gestural motion. In this letter, we discuss the results from multiple gesture perception surveys. We also develop and evaluate techniques to predict, in advance, how participants may perceive a UAV gesture. We demonstrate that participant gesture classification accuracy correlates with trajectory distance measures and present a method for evaluating high-differentiability gesture sets. This letter will enable gesture designers to create gesture sets that are differentiable with high confidence.
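To illustrate the general idea of using trajectory distance measures to screen a gesture set for differentiability, the sketch below computes pairwise distances between candidate gesture trajectories and flags the least-separated pair. Dynamic time warping (DTW) is assumed here purely for illustration; the specific distance measure and evaluation procedure in the letter may differ, and the gesture names and trajectories are hypothetical.

```python
# Minimal sketch (not the authors' implementation): use a pairwise
# trajectory distance as a proxy for how differentiable two UAV gestures
# are to an observer. DTW is an assumed stand-in distance measure.
import numpy as np


def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic time warping distance between two trajectories (N x 3 and M x 3)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])  # pointwise Euclidean cost
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])


def least_differentiable_pair(gestures):
    """Return the gesture pair with the smallest trajectory distance in the set."""
    names = list(gestures)
    best = (None, None, np.inf)
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            d = dtw_distance(gestures[names[i]], gestures[names[j]])
            if d < best[2]:
                best = (names[i], names[j], d)
    return best


# Hypothetical gesture set: each gesture is a sequence of (x, y, z) waypoints.
t = np.linspace(0.0, 2.0 * np.pi, 50)
gestures = {
    "nod":    np.stack([np.zeros_like(t), np.zeros_like(t), np.sin(t)], axis=1),
    "circle": np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1),
    "sway":   np.stack([np.sin(t), np.zeros_like(t), np.zeros_like(t)], axis=1),
}

name_a, name_b, dist = least_differentiable_pair(gestures)
print(f"Least differentiable pair: {name_a} vs {name_b} (distance = {dist:.2f})")
```

Under this kind of screening, a designer could reject or redesign gesture pairs whose trajectory distance falls below a chosen threshold, on the premise (supported by the letter's finding) that larger trajectory distances correspond to higher observer classification accuracy.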
