Abstract

The identification of emotional expressions is vital for social interaction, and can be affected by various factors, including the expressed emotion, the intensity of the expression, the sex of the face, and the gender of the observer. This study investigates how these factors affect the speed and accuracy of expression recognition, as well as dwell time on the two most significant areas of the face: the eyes and the mouth. Participants were asked to identify expressions from female and male faces displaying six expressions (anger, disgust, fear, happiness, sadness, and surprise), each at three levels of intensity (low, moderate, and normal). Overall, responses were fastest and most accurate for happy expressions, and slowest and least accurate for fearful expressions. More intense expressions were also classified more accurately. Reaction time showed a different pattern, with the slowest response times recorded for expressions of moderate intensity. Responses to female faces were the slowest overall, but also the most accurate. Relative to male observers, women showed greater accuracy and speed when recognizing female expressions. Dwell time analyses revealed that attention to the eyes was about three times greater than to the mouth, with fearful eyes in particular attracting longer dwell times. The mouth region was attended to most for fearful, angry, and disgusted expressions and least for surprise. These results extend previous findings by showing important effects of expression, emotion intensity, and sex on expression recognition and gaze behaviour, and may have implications for understanding the ways in which emotion recognition abilities break down.

Highlights

  • Accurate identification of emotional facial expressions (EFEs) is essential for everyday social interaction

  • We show that during free viewing of EFE stimuli, accuracy rates, reaction times (RTs) and eye scan paths vary with the type and intensity of the expressed emotion

  • We found that fearful and happy expressions produce the most pronounced effects: fearful expressions were recognized with the least speed and accuracy, and happy expressions with the greatest


Introduction

Accurate identification of emotional facial expressions (EFEs) is essential for everyday social interaction. The importance of communicating EFE information is emphasized by results showing that the processing of human EFEs is optimized [3,4], and that the processing of certain EFEs occurs even when the face is presented outside of conscious awareness [5,6]. Attentional allocation for emotional faces may be measured using eye tracking techniques, given the close relationship observed between eye movements and spatial attention [8,9]. Using these techniques, Eisenbarth and Alpers [10] showed that the recognition of human EFEs depends on information from two main areas of interest (AOIs): the eye region and the mouth region. We focused on four factors that may affect the processing and classification of EFEs: the type of expression, the intensity of the expression, the sex of the face, and the gender of the observer.
