Abstract

Could a robot feel authentic empathy? What exactly is empathy, and why do most humans have it? We present a model suggesting that empathy is an emergent behavior with four main elements: a mirror neuron system, somatosensory cortices, an insula, and infant-directed "baby talk" or motherese. To test our hypothesis, we implemented a robot called MEI (multimodal emotional intelligence) with these functions and allowed it to interact with human caregivers using comfort and approval motherese, the first kinds of vocalizations heard by infants at 3 and 6 months of age. The robot synchronized in real time to the humans through voice and movement dynamics while training statistical models associated with its low-level gut feeling ("flourishing" or "distress", based on battery or temperature). Experiments show that the post-interaction robot associates novel happy voices with physical flourishing 90% of the time and sad voices with distress 84% of the time. Our results also show that a robot trained with infant-directed "attention bids" can recognize adult fear voices. Importantly, this is the first emotion system to recognize adult emotional voices after training only with motherese, suggesting that this specific parental behavior may help build emotional intelligence.
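The statistical association the abstract describes can be sketched in miniature: per-class models link an acoustic feature of a voice to the internal state the robot was in when it heard similar voices, and a novel voice is then classified by whichever state's model best explains it. The sketch below is an illustration, not the authors' implementation; the feature (a mean-pitch value in Hz) and all training values are hypothetical.

```python
import math

def fit_gaussian(samples):
    """Return (mean, std) of a list of feature values."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean, math.sqrt(var) if var > 0 else 1e-6

def gaussian_pdf(x, mean, std):
    """Gaussian likelihood of feature x under one internal-state model."""
    return math.exp(-((x - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

def classify(feature, models):
    """Pick the internal-state label whose model best explains the feature."""
    return max(models, key=lambda label: gaussian_pdf(feature, *models[label]))

# Hypothetical training data: a pitch feature of motherese heard while the
# robot's low-level state was "flourishing" vs. "distress".
training = {
    "flourishing": [220.0, 240.0, 235.0, 228.0],
    "distress":    [150.0, 145.0, 160.0, 155.0],
}
models = {label: fit_gaussian(vals) for label, vals in training.items()}

print(classify(232.0, models))  # novel high-pitched ("happy") voice -> flourishing
print(classify(152.0, models))  # novel low-pitched ("sad") voice -> distress
```

In the actual system the features would come from real-time voice and movement dynamics and the labels from battery and temperature readings; the sketch only shows the shape of the association being learned.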
