Abstract
Understanding how people perceive robot gestures will aid the design of robots capable of social interaction with humans. We examined the generation and perception of a restricted form of gesture in a robot capable of simple head and arm movement, drawing on point-light animation and video experiments in human motion to derive our hypotheses. Four studies were conducted to examine the effects of situational context, gesture complexity, emotional valence and author expertise. In Study 1, four participants created gestures with corresponding emotions based on 12 provided scenarios. The resulting gestures were judged by 12 participants in a second study. Participants' recognition of emotion was better than chance and improved when situational context was provided. Ratings of lifelikeness were related to the number of arm movements (but not head movements) in a gesture. In Study 3, five novices and five puppeteers created gestures conveying Ekman's six basic emotions, which were then shown to 12 participants in Study 4. Puppetry experience improved identification rates only for the emotions of fear and disgust, possibly because of limitations of the robot's movement. The results demonstrate the communication of emotion by a social robot capable of only simple head and arm movement.