Abstract

Swarm robotics is a natural candidate for monitoring large and complex environments. With the aid of a human operator who is directly involved in monitoring and search through a teleoperated robot, swarms of autonomous robots can solve problems related to navigation and changing objectives. In this context, understanding the role of nonverbal interactions has the potential to enable rapid and bidirectional communication between humans and robots. Quantifying such interactions, however, requires a repeatable and engaging experimental setup that can track human action and cognition. This paper describes a desktop virtual reality testbed designed specifically to quantify human actions, perception, and cognitive load in real time during a monitoring mission. Motivated by the recent deployment of underwater robots to monitor invasive species, the testbed is designed to mimic an underwater environment within the Great Lakes, populated with five species of fish whose appearance, locomotion, and behavior are modeled on videos from the field. Brain activity and pupillometry data are recorded synchronously in real time to aid in measuring the cognitive load of the human operator. To quantify human perception, empirical visual acuity data from the literature are used to model virtual object recognition. The capabilities of the testbed are evaluated in terms of the frame rate achieved as a function of the number of fish and robots in the environment, and demonstrated through two examples highlighting possible uses in human-swarm interaction studies.
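The abstract does not specify the form of the recognition model, but a minimal sketch of how empirical visual acuity data could gate recognition of a virtual object is shown below; the threshold value, function names, and the use of angular size as the deciding quantity are illustrative assumptions rather than the paper's implementation.

```python
import math

# Hypothetical acuity threshold (arc minutes) below which an object is
# treated as unrecognizable; real values would come from the cited
# visual acuity literature.
ACUITY_THRESHOLD_ARCMIN = 5.0

def angular_size_arcmin(object_size_m: float, distance_m: float) -> float:
    """Visual angle subtended by an object of given size at a given distance."""
    return math.degrees(2.0 * math.atan2(object_size_m / 2.0, distance_m)) * 60.0

def is_recognizable(object_size_m: float, distance_m: float,
                    threshold_arcmin: float = ACUITY_THRESHOLD_ARCMIN) -> bool:
    """Gate recognition on whether the object subtends enough visual angle."""
    return angular_size_arcmin(object_size_m, distance_m) >= threshold_arcmin

# Example: a 0.3 m fish viewed from 8 m away.
print(is_recognizable(0.3, 8.0))
```

In a testbed like the one described, a check of this kind could be run each frame for every fish and robot relative to the operator's viewpoint, to decide which virtual objects the human can plausibly recognize at that moment.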
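Likewise, the middleware used for synchronous recording is not named in the abstract; one common approach for aligning brain activity and pupillometry streams on a shared clock is Lab Streaming Layer, sketched here under the assumption that the acquisition devices publish LSL streams of type 'EEG' and 'Gaze' (both stream types, and the loop structure, are assumptions rather than the paper's method).

```python
from pylsl import StreamInlet, resolve_byprop

# Assumed stream types; actual names depend on the acquisition hardware.
eeg_inlet = StreamInlet(resolve_byprop('type', 'EEG')[0])
gaze_inlet = StreamInlet(resolve_byprop('type', 'Gaze')[0])

records = []
while len(records) < 1000:
    # pull_sample returns (sample, timestamp); LSL timestamps are on a
    # common clock, which is what allows the two streams to be aligned.
    for name, inlet in (('eeg', eeg_inlet), ('gaze', gaze_inlet)):
        sample, ts = inlet.pull_sample(timeout=0.0)
        if sample is not None:
            records.append((name, ts, sample))
```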
