Abstract

Previous research has focused on documenting the perceptual mechanisms of facial expressions of so-called basic emotions; however, little is known about eye movements during the recognition of crying expressions. The present study aimed to clarify the visual pattern and the role of face gender in recognizing smiling and crying expressions. Behavioral reactions and fixation durations were recorded, and the proportions of fixation counts and viewing time directed at facial features (eyes, nose, and mouth area) were calculated. Results indicated that crying expressions were processed and recognized faster than smiling expressions. Across these expressions, the eyes and nose areas received more attention than the mouth area, but for smiling facial expressions, participants fixated longer on the mouth area. It seems that proportional gaze allocation to facial features was quantitatively modulated by different expressions, but overall gaze distribution was qualitatively similar across crying and smiling facial expressions. Moreover, eye movements showed that visual attention was modulated by the gender of faces: participants looked longer at female faces with smiling expressions relative to male faces. Findings are discussed in relation to the perceptual mechanisms underlying facial expression recognition and the interaction between gender and expression processing.

Highlights

  • Facial expressions are the main source of information when perceiving emotional states in other people (Shields et al., 2012).

  • It has been argued that faces may be processed more featurally or analytically (Tanaka & Farah, 1993). In support of this view is the fact that recognition of emotion in facial expressions is based on individual facial features (Calvo & Nummenmaa, 2008).

  • Recent studies suggest that the relationship between face gender and facial expression is more complex than simple independence (Le Gal & Bruce, 2002).

Introduction

Facial expressions are the main source of information when perceiving emotional states in other people (Shields et al., 2012). It has been argued that faces may be processed more featurally or analytically (Tanaka & Farah, 1993). In support of this view is the fact that recognition of emotion in facial expressions is based on individual facial features (Calvo & Nummenmaa, 2008). Becker et al. (2007) found that gender information and facial expressions are always intertwined; more precisely, participants were faster and more accurate in detecting male angry faces and female happy faces. Based on the above discussion, the current study used eye tracking to examine how people process smiling and crying faces and to explore the interaction of face gender and facial expression. The oddball task is a well-known paradigm that researchers employ to study target detection with electroencephalogram (EEG; Polich & Kok, 1995); some researchers have applied this paradigm to achieve other goals (Neta et al., 2011; Senju et al., 2003).
