Abstract

Contexts of face perception are diverse. They range from the social environment to body postures, and from the expresser's gaze direction to the tone of voice. Extending research on the contexts of face perception, we investigated people's perception of tears on a face. The act of shedding tears is often perceived as an expression of sad feelings aroused by loss, disappointment, or helplessness. Alternatively, tears may represent the excessive intensity of any emotion, such as extreme fear during an unexpected encounter with a bear or extreme happiness upon winning a competition. Investigating these competing interpretations of tears, we found that adding tears to different facial expressions made the expressions conceptually closer to sad expressions. In particular, the similarity analysis showed that, after the addition of tears, the rating patterns for angry, fearful, disgusted, and neutral facial expressions became more similar to those for sad expressions. The effects of tears on basic-emotion ratings and on the rating patterns of facial expressions are discussed.

Highlights

  • Detecting and interpreting facial expressions of emotion are essential parts of human interaction

  • The current study used representational similarity analysis (RSA) and multidimensional scaling (MDS) to test whether the effect of emotional tears on facial expressions of emotion is consistent with the sadness enhancement hypothesis or with the general enhancement hypothesis

  • Multivariate analyses such as RSA and MDS compensate for the shortcomings of comparing individual ratings by integrating all ratings into inter-stimulus dissimilarities and distances (an illustrative sketch of this approach follows the highlights). By testing these two hypotheses, which derive from distinct theoretical backgrounds, we extended the understanding of the effect of tears on facial expressions by demonstrating that the dissimilarity space among facial expressions tends to shrink toward sad expressions
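The following is a minimal sketch of the kind of analysis described above: mean emotion-rating profiles for facial-expression stimuli are converted into an RSA-style dissimilarity matrix and then embedded with MDS. The rating values, the stimulus set, the correlation-distance metric, and the two-dimensional embedding are illustrative assumptions, not the study's actual data or parameters.

```python
# Illustrative sketch (not the authors' actual pipeline): build an RSA-style
# representational dissimilarity matrix (RDM) from hypothetical mean
# emotion-rating profiles, then embed it with metric MDS.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

# Hypothetical mean ratings (rows = stimuli, columns = basic-emotion scales:
# anger, disgust, fear, happiness, sadness, surprise), on an arbitrary 1-7 scale.
labels = ["anger", "anger+tears", "fear", "fear+tears",
          "neutral", "neutral+tears", "sadness"]
ratings = np.array([
    [6.2, 2.1, 1.8, 1.2, 1.5, 1.4],   # anger
    [4.0, 2.0, 2.2, 1.1, 4.8, 1.6],   # anger + tears
    [1.6, 2.3, 6.0, 1.3, 2.0, 3.5],   # fear
    [1.5, 2.1, 4.4, 1.2, 4.9, 2.8],   # fear + tears
    [1.3, 1.2, 1.4, 2.0, 1.6, 1.3],   # neutral
    [1.4, 1.3, 1.8, 1.4, 4.6, 1.5],   # neutral + tears
    [1.5, 1.6, 2.1, 1.1, 6.3, 1.4],   # sadness
])

# RDM: 1 - Pearson correlation between each pair of rating profiles
# (a common choice of dissimilarity in RSA).
rdm = squareform(pdist(ratings, metric="correlation"))

# Metric MDS on the precomputed dissimilarities gives low-dimensional
# coordinates in which inter-stimulus distances can be inspected.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(rdm)

for name, (x, y) in zip(labels, coords):
    print(f"{name:>15s}: ({x: .2f}, {y: .2f})")
```

In such an embedding, a shrinking of the dissimilarity space toward sad expressions would appear as the tearful variants of each expression moving closer to the sadness stimulus than their tear-free counterparts.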

Introduction

Detecting and interpreting facial expressions of emotion are essential parts of human interaction. Recent studies investigating the role of context have demonstrated that information from contexts and facial expressions is integrated during emotion attribution. This integration of contextual cues occurs automatically, to the extent that it can overpower the emotional cues from facial expressions (Aviezer et al., 2011). These findings show a significant impact of context on the perception of facial expressions and challenge the notion of universal, basic facial expressions (Wieser and Brosch, 2012; Aviezer et al., 2017).
