Abstract

The success of social robot services depends on how well robots interact with users. Effective service requires smooth, engaged, and human-like interactions in which the robot responds appropriately to a user's affective state. This article proposes a novel Automatic Cognitive Empathy Model (ACEM) for humanoid robots to achieve longer and more engaged human-robot interactions (HRI) by recognizing human emotions and responding to them appropriately. The proposed model continuously detects a user's affective state from facial expressions and generates the desired empathic behavior, either parallel or reactive, adapted to the user's personality. Affective states are detected with a stacked autoencoder network trained and tested on the RAVDESS dataset. The overall empathic model is verified in an experiment in which different emotions are elicited in participants and empathic behaviors are then applied according to the proposed hypotheses. The results confirm the effectiveness of the proposed model in terms of the social and friendship concepts participants perceived during interaction with the robot.
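
For illustration only, the sketch below shows what a stacked-autoencoder affect classifier of the kind described above could look like. It is a minimal example assuming PyTorch; the layer sizes, the flattened facial-feature input, and the eight-class output (matching the RAVDESS emotion labels) are assumptions for the sketch, not the authors' reported configuration.

```python
# Minimal sketch of a stacked-autoencoder emotion classifier (assumes PyTorch).
# Layer sizes, input dimensionality, and the 8 emotion classes are illustrative
# assumptions, not the configuration reported in the article.
import torch
import torch.nn as nn

class StackedAutoencoderClassifier(nn.Module):
    def __init__(self, input_dim=2048, hidden_dims=(512, 128), num_classes=8):
        super().__init__()
        # Encoder: stacked fully connected layers that compress the facial features.
        encoder_layers, dim = [], input_dim
        for h in hidden_dims:
            encoder_layers += [nn.Linear(dim, h), nn.ReLU()]
            dim = h
        self.encoder = nn.Sequential(*encoder_layers)
        # Decoder mirrors the encoder for unsupervised reconstruction pre-training.
        decoder_layers = []
        for h in reversed((input_dim,) + hidden_dims[:-1]):
            decoder_layers += [nn.Linear(dim, h), nn.ReLU()]
            dim = h
        self.decoder = nn.Sequential(*decoder_layers[:-1])  # no ReLU on the output
        # Classification head fine-tuned to predict the affective state.
        self.classifier = nn.Linear(hidden_dims[-1], num_classes)

    def forward(self, x):
        z = self.encoder(x)
        return self.classifier(z), self.decoder(z)

# Usage: the decoder output serves reconstruction pre-training; the logits
# serve supervised fine-tuning on labeled emotion data.
model = StackedAutoencoderClassifier()
features = torch.randn(4, 2048)            # stand-in for extracted face features
logits, reconstruction = model(features)
print(logits.shape, reconstruction.shape)  # torch.Size([4, 8]) torch.Size([4, 2048])
```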
