Abstract

Non-humanoid robots are increasingly used for collaborative tasks across many domains, including industrial and service settings. Collaboration between a human and a robot relies on each partner's ability to effectively convey their mental state while accurately estimating and interpreting the other's knowledge, intent, and actions. My research focuses on nonverbal communication signals that a non-humanoid robot can use during human-robot collaboration. I focus on motion, light, and sound, as they are commonly used communication channels across many domains and are available on most robot platforms. As a first step toward this goal, I present a completed study exploring how a simple multimodal light-and-sound signal can be used to request help during a collaborative task. I then discuss future work on generating and using more complex signals to convey a variety of statuses and improve collaboration.
