Human facial and bodily expressions play a crucial role in human–human interaction, conveying the communicator’s feelings. Echoing the influence of human social behavior, recent studies in human–robot interaction (HRI) have investigated how to generate emotional behaviors for social robots. Emotional behaviors can enhance user engagement, allowing the user to interact with robots in a transparent manner. However, such behaviors are ambiguous and affected by many factors, such as personality traits, culture, and environment. This article focuses on developing a robot’s emotional bodily expressions by adopting the user’s affective gestures. We propose a behavior selection and transformation model that enables a robot to incrementally learn from the user’s gestures, to select the user’s habitual behaviors, and to transform the selected behaviors into robot motions. Experimental results under several scenarios showed that the proposed incremental learning model endows a social robot with the capability to sustain a positive, long-lasting HRI. We also confirmed that the robot can express emotions through motions imitated from the user. The robot’s emotional gestures, which reflected the interacting partner’s traits, were widely accepted within the same cultural group and were perceived in different ways across cultural groups.