Abstract

Social interaction requires fast and efficient processing of another person’s intentions. In face-to-face interactions, aversive or appetitive actions typically co-occur with emotional expressions, allowing an observer to anticipate action intentions. In the present study, we investigated the influence of facial emotions on the processing of action intentions. Thirty-two participants were presented with video clips showing virtual agents displaying a facial emotion (angry vs. happy) while performing an action (punch vs. fist-bump) directed towards the observer. During each trial, video clips stopped at varying durations of the unfolding action, and participants had to recognize the presented action. Naturally, participants’ recognition accuracy improved with increasing duration of the unfolding actions. Interestingly, while facial emotions did not influence accuracy, there was a significant influence on participants’ action judgements. Participants were more likely to judge a presented action as a punch when agents showed an angry compared to a happy facial emotion. This effect was more pronounced in short video clips, showing only the beginning of an unfolding action, than in long video clips, showing near-complete actions. These results suggest that facial emotions influence anticipatory processing of action intentions, allowing for fast and adaptive responses in social interactions.

Highlights

  • Social interaction requires the understanding of action intentions

  • The present findings show that facial emotions bias the processing of action intentions

  • The emotional bias effect was most prominent at short video durations, when participants could not yet reliably discriminate between the actions


Introduction

Social interaction requires the understanding of action intentions. In face-to-face situations, it is crucial to capture another person’s intentions as quickly and accurately as possible in order to generate an adaptive response. Recognizing action intentions is necessary to ensure that interactive behavior is coordinated in space and time [1]. This task is mastered effortlessly in our everyday lives, for instance, whenever we reciprocate another person’s greeting or reach for an object that is offered to us [2]. According to motor simulation theories, observation of an action activates the corresponding motor program in the observer’s motor system and thereby enables action recognition.
