Abstract

As robots become more ubiquitous, they will increasingly need to behave as our team partners and smoothly adapt to (adaptive) human team behaviors to establish successful patterns of collaboration over time. Many adaptations manifest themselves through subtle and unconscious interactions, which are difficult to observe. Our research aims to bring about awareness of such co-adaptation to enable team learning. This paper presents an experimental paradigm that uses a physical human-robot collaborative task environment to explore emergent human-robot co-adaptations and to derive the underlying interaction patterns (i.e., the targeted awareness of co-adaptation). The paradigm provides a tangible human-robot interaction (i.e., a leash) that facilitates the expression of unconscious adaptations, such as “leading” (e.g., pulling the leash) and “following” (e.g., letting go of the leash) in a search-and-navigation task. The task was executed by 18 participants, after which we systematically annotated videos of their behavior. We found that their interactions could be described by four types of adaptive interactions: stable situations, sudden adaptations, gradual adaptations, and active negotiations. From these types of interactions we created a language of interaction patterns that can be used to describe tacit co-adaptation in human-robot collaborative contexts. In future studies, this language can be used to enable communication between collaborating humans and robots, letting them share what they have learned and supporting them in becoming aware of their implicit adaptations.
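
To make the idea of an interaction-pattern language more concrete, the sketch below (not from the paper; names such as InteractionEvent, initiator, and pattern_signature are illustrative assumptions) shows one possible way to encode the four interaction types as annotation records and combine them into larger sequences.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import List

# The four adaptive interaction types reported in the paper.
class InteractionType(Enum):
    STABLE_SITUATION = auto()
    SUDDEN_ADAPTATION = auto()
    GRADUAL_ADAPTATION = auto()
    ACTIVE_NEGOTIATION = auto()

# Hypothetical annotation record: one labelled episode from the video data.
@dataclass
class InteractionEvent:
    start_s: float          # episode start time (seconds)
    end_s: float            # episode end time (seconds)
    kind: InteractionType   # which of the four adaptive interaction types
    initiator: str          # "human" or "robot" (assumed field)
    note: str = ""          # free-text annotator comment

def pattern_signature(events: List[InteractionEvent]) -> List[InteractionType]:
    """Reduce an annotated episode to its ordered sequence of interaction types,
    i.e., one 'sentence' in the interaction-pattern language."""
    return [e.kind for e in events]

if __name__ == "__main__":
    episode = [
        InteractionEvent(0.0, 4.2, InteractionType.STABLE_SITUATION, "human"),
        InteractionEvent(4.2, 5.1, InteractionType.SUDDEN_ADAPTATION, "robot",
                         "robot pulls the leash toward a new corridor"),
        InteractionEvent(5.1, 9.8, InteractionType.ACTIVE_NEGOTIATION, "human"),
    ]
    print(pattern_signature(episode))
```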

Highlights

  • With AI being increasingly used in social robotics (Breazeal et al., 2016), there is a growing number of possible applications in which artificially intelligent robots need to interact and collaborate with humans in the physical space

  • We use the term co-adaptation instead of team adaptation, as we study the adaptive interactions at the level of the individual actors: team adaptation is the result of adaptive behavior exhibited by the individual team members

  • We will describe exactly what interactions were extracted from the video data, how they were categorized and generalized into interaction patterns and how they can be combined into larger sequences

Introduction

With AI being increasingly used in social robotics (Breazeal et al., 2016), there is a growing number of possible applications in which artificially intelligent robots need to interact and collaborate with humans in the physical space. Creating AI for the physical world comes with many challenges, one of which is ensuring that a robot does not merely execute its own task, but instead behaves as a team partner, enabling human and robot to become one well-functioning unit of collaboration. Humans have the ability to intuitively interpret the body language of their team members and to send signals when initiating adaptations (Sacheli et al., 2013). This kind of non-verbal interaction is not obvious when a team member is a robot. While we might be able to interact with a robot using language, collaborative interactions are generally multimodal and contain many subtle and implicit non-verbal cues that help us build tacit knowledge. The focus of this paper is on these non-verbal interactions, in particular those connected to physical contact.
