Abstract

The aim of our work is to design bodily mood expressions of humanoid robots for interactive settings that can be recognized by users and have (positive) effects on people who interact with the robots. To this end, we develop a parameterized behavior model for humanoid robots to express mood through body language. Different settings of the parameters, which control the spatial extent and motion dynamics of a behavior, result in different behavior appearances expressing different moods. In this study, we applied the behavior model to the gestures of the imitation game performed by the NAO robot to display either a positive or a negative mood. We address the question of whether robot mood displayed simultaneously with the execution of functional behaviors in a task can (a) be recognized by participants and (b) produce contagion effects. Mood contagion is an automatic mechanism that induces a congruent mood state through the observation of another person's emotional expression. In addition, we varied task difficulty to investigate how task load mediates these effects. Our results show that participants are able to differentiate between positive and negative robot mood and to recognize the behavioral cues (the parameters) we manipulated. Moreover, self-reported mood matches the mood expressed by the robot in the easy task condition. Additional evidence for mood contagion is provided by the fact that we were able to replicate an expected effect of negative mood on task performance: in the negative mood condition participants performed better on difficult tasks than in the positive mood condition, even though participants' self-reported mood did not match that of the robot.
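
To illustrate the idea of a parameterized behavior model, the sketch below shows one way mood parameters could modulate the spatial extent (amplitude, head pose) and the motion dynamics (speed) of a functional gesture. This is a minimal sketch under assumed keyframe and parameter names; it is not the behavior model implemented in the paper.

```python
# Minimal sketch of a mood-parameterized gesture model, assuming gestures are
# given as joint-angle keyframes. Parameter names and values are illustrative
# assumptions, not the authors' actual model.

from dataclasses import dataclass

@dataclass
class MoodParameters:
    amplitude: float    # spatial extent: scales joint excursions around a neutral pose
    head_pitch: float   # head up (+) / down (-) offset, in radians
    speed: float        # motion dynamics: >1 plays the gesture faster, <1 slower

# Hypothetical settings: positive mood = larger, faster, head raised;
# negative mood = smaller, slower, head lowered.
POSITIVE = MoodParameters(amplitude=1.2, head_pitch=0.15, speed=1.3)
NEGATIVE = MoodParameters(amplitude=0.6, head_pitch=-0.20, speed=0.7)

def modulate_gesture(keyframes, neutral_pose, mood):
    """Apply mood parameters to a functional gesture.

    keyframes:    list of (time_s, {joint_name: angle_rad}) pairs defining the gesture.
    neutral_pose: {joint_name: angle_rad} around which spatial extent is scaled.
    Returns a new keyframe list: the functional gesture is preserved while its
    spatial extent and timing change with the expressed mood.
    """
    modulated = []
    for t, angles in keyframes:
        new_angles = {
            joint: neutral_pose.get(joint, 0.0)
            + (angle - neutral_pose.get(joint, 0.0)) * mood.amplitude
            for joint, angle in angles.items()
        }
        new_angles["HeadPitch"] = new_angles.get("HeadPitch", 0.0) + mood.head_pitch
        modulated.append((t / mood.speed, new_angles))
    return modulated
```

Under these assumptions, calling `modulate_gesture(wave_keyframes, neutral_pose, NEGATIVE)` would yield the same functional wave gesture, but smaller, slower, and with the head lowered.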

Highlights

  • In human–robot interaction (HRI), expressions of a robot facilitate human understanding of the robot’s behavior, affects, rationale, and motives, and are known to increase the perception of a robot as trustworthy, reliable, and life-like [1]

  • This study aims to investigate how a social robot expresses affect through body language during task execution in the context of a dyadic human–robot interaction

  • We describe the interactive game used in our study, and the integration of the behavior model into the game gestures is introduced in Sect


Summary

Introduction

In human–robot interaction (HRI), expressions of a robot facilitate human understanding of the robot’s behavior, affects (e.g., emotions and moods), rationale, and motives, and are known to increase the perception of a robot as trustworthy, reliable, and life-like [1]. Among the many ways of showing affect, such as speech, voice, facial expressions, bodily expressions, color, and lights, we are interested in bodily expressions of humanoid robots. Expressing robot affect through the body enables people to use their existing skills for reading body language to better understand robots. A study showed that bodily expressions in addition to facial expressions improved the recognition of affect [4]. Making the robot body expressive may therefore improve people’s understanding of robot affect. For humanoid robots that lack facial features, such as the NAO, ASIMO, and QRIO, the body is an important channel for expressing affect nonverbally.

