Abstract
Recently, robot services have been widely applied in many fields. To provide optimal service, it is essential to maintain good acceptance of the robot so that it can interact with users more effectively. Previously, we attempted to implement facial expressions by synchronizing an estimated human emotion onto the face of a robot. The results revealed that the robot was perceived differently according to individual preferences. In this study, we considered individual differences to improve the acceptance of the robot by changing the robot’s expression according to the emotion of its interacting partner. The emotion was estimated from biological signals, and the robot changed its expression under three conditions: synchronized with the estimated emotion, inversely synchronized, and a funny expression. During the experiment, the participants provided feedback on the robot’s expression by indicating whether they “liked” or “disliked” it. We investigated individual differences in the acceptance of the robot’s expression using the Semantic Differential scale method. In addition, logistic regression was used to build a classification model that accounts for individual differences, based on the biological data and feedback from each participant. We found that the inversely synchronized expression, shown when participants felt a negative emotion, produced impressions that differed among individuals. The robot’s expression was then determined by the classification model, and the Semantic Differential scale ratings of the robot’s impression were compared across the three conditions. Overall, we found that the participants were most accepting when the robot’s expression was selected by the proposed personalized method.
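The per-participant classification step described above can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the feature names, data, and training settings are assumptions, standing in for whatever biological-signal features and like/dislike labels were collected from each participant.

```python
import numpy as np

# Hypothetical per-participant training data: each row is a feature vector
# derived from biological signals (illustrative placeholders, not the
# study's real features); the label is the participant's "like" (1) /
# "dislike" (0) feedback on a robot expression.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 3))            # 80 trials, 3 illustrative features
true_w = np.array([1.5, -2.0, 0.5])     # synthetic ground-truth weights
y = (X @ true_w + rng.normal(scale=0.3, size=80) > 0).astype(float)

def fit_logistic(X, y, lr=0.1, epochs=500):
    """Fit a logistic-regression classifier by batch gradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted P(like)
        w -= lr * (X.T @ (p - y)) / len(y)      # gradient of log-loss w.r.t. w
        b -= lr * np.mean(p - y)                # gradient w.r.t. bias
    return w, b

def predict_like(X, w, b):
    """Return 1 ('like') where the model's probability exceeds 0.5."""
    return (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)

w, b = fit_logistic(X, y)
train_accuracy = np.mean(predict_like(X, w, b) == y)
```

In use, a model like this would be fit separately for each participant, and the expression condition with the highest predicted probability of a "like" response would be chosen as the personalized expression.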
Highlights
In recent years, the use of robot services has expanded in commerce and welfare facilities due to the declining birthrate and aging society [1].
We focused on improving the acceptance of the robot’s facial expression for better human–robot communication by changing the robot’s expression according to the emotion information of its interacting partner.
The scores for the personalized expression were highest among all conditions for most of the impression items, although the personalized expression did not affect the vitality group of items.
Summary
The use of robot services has expanded in commerce and welfare facilities due to the declining birthrate and aging society [1]. The estimated emotion was used to determine the robot’s facial expression in order to investigate people’s acceptance of the robot based on the emotion synchronization effect. Li et al. used facial expression recognition in their study of the emotional synchronization effect for human–robot communication. They compared impressions of the robot between synchronization and non-synchronization of the emotion, which was estimated from the participants using facial expression recognition. They found that communication based on emotional synchronization produced better impressions of the robot than non-synchronization, increasing people’s willingness to communicate with the robot in a favorable manner [13].