Abstract

To effectively communicate with people, social robots must be capable of detecting, interpreting, and responding to human affect during human–robot interactions (HRIs). To accurately detect user affect during HRIs, affect elicitation techniques need to be developed to create and train appropriate affect detection models. In this paper, we present a novel affect elicitation and detection method for social robots in HRIs. Non-verbal emotional behaviors of the social robot were designed to elicit user affect, which was measured directly through electroencephalography (EEG) signals. HRI experiments with both younger and older adults were conducted to evaluate our affect elicitation technique and to compare the two types of affect detection models we developed and trained using multilayer perceptron neural networks (NNs) and support vector machines (SVMs). The results showed that, on average, the self-reported valence and arousal were consistent with the intended elicited affect. Furthermore, the EEG data obtained could be used to train affect detection models, with the NN models achieving higher classification rates than the SVM models.
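
As a concrete illustration of the model comparison described above, the sketch below trains a multilayer perceptron NN and an SVM on EEG-derived feature vectors and reports their classification rates. This is a minimal sketch in Python with scikit-learn, not the authors' implementation; the synthetic features, their dimensionality, and the binary affect labels are illustrative assumptions only.

    # Minimal sketch (not the authors' code): comparing an MLP neural network
    # and an SVM on EEG-derived feature vectors, as in the abstract. The
    # synthetic data, feature dimensionality (32), and binary labels
    # (e.g., low/high arousal) are illustrative assumptions.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPClassifier
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 32))       # stand-in per-trial EEG features
    y = rng.integers(0, 2, size=200)     # stand-in affect labels

    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.25, stratify=y, random_state=0)

    # Both classifiers benefit from standardized features.
    scaler = StandardScaler().fit(X_tr)
    X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)

    models = {
        "NN (MLP)": MLPClassifier(hidden_layer_sizes=(64, 32),
                                  max_iter=2000, random_state=0),
        "SVM (RBF)": SVC(kernel="rbf", C=1.0, gamma="scale"),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        acc = accuracy_score(y_te, model.predict(X_te))
        print(f"{name}: classification rate = {acc:.2f}")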

Highlights

  • A growing number of social robots are being integrated into our daily lives, as they can assist and extend human capabilities in human-centered environments such as homes, hospitals, and workplaces [1]

  • Our research focuses on developing socially assistive robots that help people, including older adults and those living with cognitive impairments, with activities of daily living such as meal assistance [23,24], dressing [25,26], exercising [7,27], and cognitively stimulating interventions such as Bingo and trivia games [28,29,30]

  • We present the development of a novel autonomous affect elicitation and detection methodology for social robots engaging in human–robot interactions (HRIs)


Introduction

A growing number of social robots are being integrated into our daily lives, as they can assist and extend human capabilities in human-centered environments such as homes, hospitals, and workplaces [1]. Our research focuses on developing socially assistive robots that help people, including older adults and those living with cognitive impairments, with activities of daily living such as meal assistance [23,24], dressing [25,26], exercising [7,27], and cognitively stimulating interventions such as Bingo and trivia games [28,29,30]. During such assistive interactions, it is important for these robots to detect users' intent and their affect with respect to the interaction, in order to determine and adapt their own assistive behaviors to each user. To the authors' knowledge, we are the first to recognize user affect elicited by interacting with a social robot in a social HRI scenario through the use of EEG signals.
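
The paper's exact EEG features are specified in its methodology, but a common first step in EEG-based affect detection is band-power feature extraction. The sketch below is our own assumption rather than the paper's pipeline: the sampling rate, channel count, and frequency-band limits are illustrative, and SciPy's Welch estimator stands in for whatever spectral method the authors used. It shows how a window of raw multi-channel EEG can be reduced to a feature vector suitable for models like the NN and SVM above.

    # Minimal sketch (our assumption, not the paper's pipeline): reducing a
    # window of multi-channel EEG to band-power features via Welch's PSD.
    import numpy as np
    from scipy.signal import welch

    FS = 128                                  # assumed sampling rate (Hz)
    BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_power_features(eeg, fs=FS):
        """eeg: (n_channels, n_samples) array -> flat feature vector."""
        freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs, axis=-1)
        feats = []
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(psd[:, mask].sum(axis=-1))  # summed PSD per band
        return np.concatenate(feats)

    # Example: 4 channels, 5 s of simulated EEG -> 3 bands x 4 channels = 12.
    eeg = np.random.default_rng(1).normal(size=(4, 5 * FS))
    print(band_power_features(eeg).shape)     # (12,)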

Related Work on Affect Elicitation Using Robots
  • Coded Affect
  • Self-Reported Affect
  • Use of Both Self-Reported and Coded Affect
Methodology Using EEG Signals
  • Average Self-Assessment
  • Affect Detection Model
Experiments
Results
  • Affect Detection Models
Conclusions