Abstract

This study proposes a music-aided framework for affective interaction between service robots and humans. The framework consists of three systems for perception, memory, and expression, modeled on human brain mechanisms. We propose a novel approach to identifying human emotions in the perception system. Conventional approaches use speech and facial expressions as representative bimodal indicators for emotion recognition. Our approach additionally uses the mood of music as a supplementary indicator to determine emotions more accurately, alongside speech and facial expressions. For multimodal emotion recognition, we propose an effective decision criterion that uses records of bimodal recognition results relevant to the musical mood. The memory and expression systems also utilize musical data to provide natural and affective reactions to human emotions. To evaluate our approach, we simulated the proposed human-robot interaction with a service robot, iRobiQ. Our perception system outperformed the conventional approach, and most human participants responded favorably toward the music-aided affective interaction.
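As an illustration only, the multimodal decision described above can be thought of as a weighted fusion of per-emotion confidence scores, where the musical mood acts as a supplementary prior alongside the bimodal (speech and face) recognizers. This is a minimal sketch under that assumption; the function name, emotion labels, and weights are hypothetical and are not the paper's actual decision criterion.

```python
# Hypothetical sketch of decision-level fusion: music mood as a
# supplementary indicator alongside speech and facial expression.
# Names and weights are illustrative, not the paper's method.

def fuse_emotion(speech_scores, face_scores, mood_prior, w=(0.4, 0.4, 0.2)):
    """Combine per-emotion confidences from three sources.

    Each argument maps an emotion label to a confidence in [0, 1];
    w gives the (speech, face, mood) fusion weights.
    """
    fused = {
        e: w[0] * speech_scores[e] + w[1] * face_scores[e] + w[2] * mood_prior[e]
        for e in speech_scores
    }
    # Pick the emotion with the highest fused score.
    return max(fused, key=fused.get)

speech = {"happy": 0.50, "sad": 0.40, "neutral": 0.10}
face   = {"happy": 0.40, "sad": 0.45, "neutral": 0.15}
mood   = {"happy": 0.80, "sad": 0.10, "neutral": 0.10}  # e.g., upbeat music playing

print(fuse_emotion(speech, face, mood))  # prints "happy"
```

In this toy example the bimodal scores are nearly tied between "happy" and "sad", and the mood of the currently playing music tips the decision, which is the intuition behind using music as a supplementary indicator.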

Highlights

  • Service robots operate autonomously to provide useful services for humans

  • When a participant produces speech and a facial expression corresponding to a certain emotion type, iRobiQ determines the emotional state and plays music clips relevant to that emotion, accompanied by audiovisual expressions such as eye expression, cheek color, and synthesized speech

  • The music-driven interface was reported to provide acoustically more affective and natural interaction for humans; in a subjective test, most participants reported a favorable reaction toward the music-aided affective interaction


Introduction

Service robots operate autonomously to provide useful services for humans. They interact with a large number of users in a variety of places, from hospitals to homes. An immense variety of service robots are being developed to perform human tasks such as educating children and assisting elderly people. To coexist in humans' daily lives and offer services in accordance with a user's intention, service robots should be able to interact and communicate with humans affectively. Affective interaction provides robots with human-like capabilities for comprehending the emotional states of users and interacting with them. If a robot detects a negative user emotion, it might encourage or console the user by playing digital music or synthesized speech and by performing controlled movements. The primary task for affective interaction is to provide the robot with the capacity to

