Abstract
Allowing virtual humans to align with others’ perceived emotions is believed to enhance their cooperative and communicative social skills. In our work, emotional alignment is realized by endowing a virtual human with the ability to empathize. Recent research shows that humans empathize with each other to different degrees depending on several factors including, among others, their mood, their personality, and their social relationships. Although virtual humans have been endowed with features like affect, personality, and the ability to build social relationships, little attention has been devoted to the role of such features in modulating their empathic behavior. Supported by psychological models of empathy, we propose an approach to model empathy for the virtual human EMMA—an Empathic MultiModal Agent—consisting of three processing steps: first, the Empathy Mechanism, by which an empathic emotion is produced; second, the Empathy Modulation, by which the empathic emotion is modulated; third, the Expression of Empathy, by which EMMA’s multiple modalities are triggered by the modulated empathic emotion. The proposed model of empathy is illustrated in a conversational agent scenario involving the virtual humans MAX and EMMA.
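The three-step pipeline can be sketched in code. The following Python fragment is a minimal, hypothetical illustration of the mechanism → modulation → expression flow; the class EmpathyModel, the valence/arousal representation, the modulation factors (mood, liking), and all weights are assumptions for exposition, not the paper's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Emotion:
    """A point in a simple valence/arousal affect space (hypothetical representation)."""
    valence: float  # -1.0 (negative) .. 1.0 (positive)
    arousal: float  #  0.0 (calm)     .. 1.0 (excited)

class EmpathyModel:
    """Hypothetical three-step empathy pipeline: mechanism -> modulation -> expression."""

    def __init__(self, mood: float, liking: float):
        self.mood = mood      # agent's current mood, -1..1 (assumed modulation factor)
        self.liking = liking  # social-relationship factor, 0..1 (assumed)

    def empathy_mechanism(self, perceived: Emotion) -> Emotion:
        # Step 1: produce an empathic emotion, here by mirroring the perceived one.
        return Emotion(perceived.valence, perceived.arousal)

    def modulate(self, empathic: Emotion) -> Emotion:
        # Step 2: attenuate or amplify the empathic emotion by the modulation
        # factors (liking and mood congruence; the weighting is illustrative).
        degree = self.liking * (0.5 + 0.5 * max(0.0, self.mood * empathic.valence))
        return Emotion(empathic.valence * degree, empathic.arousal * degree)

    def express(self, modulated: Emotion) -> dict:
        # Step 3: map the modulated empathic emotion onto output modalities.
        return {
            "facial_expression": "smile" if modulated.valence > 0 else "frown",
            "speech_intensity": modulated.arousal,
        }

# Usage: EMMA perceives MAX's joyful state and reacts empathically.
emma = EmpathyModel(mood=0.4, liking=0.8)
perceived = Emotion(valence=0.7, arousal=0.6)
print(emma.express(emma.modulate(emma.empathy_mechanism(perceived))))
```

In this sketch the degree of empathy grows with liking and with mood congruence, so the same perceived emotion yields weaker expression toward a disliked interlocutor; this mirrors, under the stated assumptions, the paper's claim that mood, personality, and social relationships modulate empathic behavior.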