Abstract

We present the conceptual formulation, design, fabrication, control, and commercial translation of an IoT-enabled social robot, validated through human emotional response to its affective interactions. The robot design centres on a humanoid hybrid face that integrates a rigid faceplate with a digital display, simplifying the conveyance of complex facial movements while preserving the impression of three-dimensional depth. We map the robot's emotions to specific facial feature parameters, characterise the recognisability of archetypal facial expressions, and introduce pupil dilation as an additional degree of freedom for emotion conveyance. Human interaction experiments demonstrate that the hybrid robot face effectively conveys emotion to humans. Conveyance is quantified by studying neurophysiological electroencephalography (EEG) responses to perceived emotional information as well as through qualitative interviews. Results demonstrate that core hybrid-face robotic expressions are discriminated by humans (over 80% recognition) and evoke face-sensitive event-related potentials such as the N170 and vertex positive potential (VPP) in EEG. The hybrid-face robot concept has been modified, implemented, and released by Emotix Inc in the commercial IoT robotic platform Miko ('My Companion'), an affective robot currently in use for human-robot interaction with children. We demonstrate that human EEG responses to Miko's emotions are comparable to those elicited by the hybrid-face robot, validating the design modifications implemented for large-scale distribution. Finally, interviews show expression recognition rates above 90% for our commercial robot. We conclude that a simplified hybrid-face abstraction conveys emotions effectively and enhances human-robot interaction.
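To illustrate the kind of emotion-to-parameter mapping described above, here is a minimal sketch. All parameter names and numeric values are hypothetical, chosen only to show the idea of driving a display-based face from a small parameter set; they are not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class FaceParams:
    """Hypothetical facial feature parameters for a hybrid display face."""
    brow_angle: float      # degrees; negative tilts the inner brows down
    mouth_curve: float     # -1.0 (frown) .. +1.0 (smile)
    eye_openness: float    # 0.0 (closed) .. 1.0 (wide open)
    pupil_dilation: float  # 0.0 (constricted) .. 1.0 (fully dilated)

# Archetypal expressions mapped to parameter sets (illustrative values only)
EXPRESSIONS = {
    "happy":    FaceParams(brow_angle=10.0,  mouth_curve=0.9,  eye_openness=0.8, pupil_dilation=0.7),
    "sad":      FaceParams(brow_angle=-15.0, mouth_curve=-0.8, eye_openness=0.5, pupil_dilation=0.3),
    "surprise": FaceParams(brow_angle=20.0,  mouth_curve=0.2,  eye_openness=1.0, pupil_dilation=0.9),
    "anger":    FaceParams(brow_angle=-25.0, mouth_curve=-0.6, eye_openness=0.9, pupil_dilation=0.4),
}

def render(expression: str) -> FaceParams:
    """Look up the parameter set a display renderer would draw for an expression."""
    return EXPRESSIONS[expression]

print(render("happy"))
```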

Highlights

  • Affective social robots are gaining increasing interest in research and social applications

  • We introduce a complete hybrid-face affective robotic system that conveys human-like facial emotions without the complexity of full facial actuation, and demonstrate its translation to a commercial Internet of Things (IoT) robot ('Miko', 'my companion') that is in use today for affective robot interaction with children

  • Pupil dilation is introduced as an additional expressive degree of freedom: as dilation swept gradually from minimum to maximum for each expression, participants reported when the dilation began to suit the face, when it best matched the presented expression, and when it began to mismatch the facial expression (see the sketch after this list)
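A minimal sketch of how such a dilation sweep and the three participant reports might be collected. The function names, step count, and flag labels are assumptions for illustration, not the authors' actual protocol.

```python
def dilation_sweep(n_steps: int = 50):
    """Linear ramp of pupil dilation from minimum (0.0) to maximum (1.0)."""
    return [i / (n_steps - 1) for i in range(n_steps)]

def record_thresholds(participant_flags):
    """
    participant_flags: (dilation_value, flag) pairs collected during the sweep,
    where flag is one of "suits", "matches", "mismatches".
    Returns the first dilation value at which each flag was raised.
    """
    thresholds = {}
    for value, flag in participant_flags:
        thresholds.setdefault(flag, value)
    return thresholds

# The display would step through the sweep while the participant watches
for value in dilation_sweep(5):
    print(f"presenting dilation {value:.2f}")

# Example: one participant's three reports during a sweep for a single expression
flags = [(0.30, "suits"), (0.55, "matches"), (0.85, "mismatches")]
print(record_thresholds(flags))  # {'suits': 0.3, 'matches': 0.55, 'mismatches': 0.85}
```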


Introduction

Affective social robots are gaining increasing interest in research and social applications. Humanoid social robots provide a means to investigate social cognition, engage with humans, and support human mental health. Humans respond better to robots that behave empathetically towards them by recognising emotion and responding appropriately [5–11]. Fundamental work by Breazeal and Ishiguro [4, 12, 13, 14] grounded the field of affective human-robot communication (see [7, 8, 15, 16] for recent surveys). Industry translation of social robots has begun in the service and hospitality sectors, though challenges in reliability and human acceptance remain unresolved [17]. We develop a practical approach to designing simplified affective robots that convey emotions effectively.

