Abstract

Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended even when they are irrelevant to the task at hand or when faces appear at different locations in the visual field. To this end, fearful, happy and neutral faces were presented to healthy individuals in two experiments while eye movements were measured. In Experiment 1, participants performed an emotion classification task, a gender discrimination task or a passive viewing task. To differentiate fast, potentially reflexive eye movements from a more elaborate scanning of faces, stimuli were presented for either 150 ms or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements merely reflect a general bias for certain locations in the visual field. In both experiments, participants fixated the eye region much longer than any other region of the face. Furthermore, the eye region was attended more strongly when fearful or neutral faces were shown, whereas more attention was directed toward the mouth of happy facial expressions. Since these results held across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial location. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, because these gaze preferences emerged very early after stimulus onset and occurred even when saccades did not allow further information to be extracted from the stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orienting of attention toward them. This mechanism might depend crucially on amygdala functioning and may be impaired in clinical conditions such as autism or social anxiety disorder.
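The saliency analyses mentioned above compare observed fixation patterns against the predictions of a computational model of bottom-up visual attention. The abstract does not specify the implementation, so purely as an illustration of the logic, the sketch below computes a simple bottom-up saliency map with the spectral-residual method (Hou & Zhang, 2007) and compares mean saliency inside an eye region and a mouth region. The stand-in image, the ROI coordinates and all function names are hypothetical placeholders, not values or code from the study.

```python
import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def spectral_residual_saliency(gray):
    """Bottom-up saliency map via the spectral-residual method
    (Hou & Zhang, 2007); `gray` is a 2-D float array."""
    f = np.fft.fft2(gray)
    log_amp = np.log(np.abs(f) + 1e-8)       # log amplitude spectrum
    phase = np.angle(f)                       # phase spectrum
    # Residual = log spectrum minus its local average (3x3 box filter)
    residual = log_amp - uniform_filter(log_amp, size=3)
    # Back-transform; squared magnitude gives the raw saliency map
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return gaussian_filter(sal, sigma=3)      # smooth the map

def mean_roi_saliency(sal, roi):
    """Average saliency in a rectangular region (top, bottom, left, right)."""
    t, b, l, r = roi
    return sal[t:b, l:r].mean()

# Hypothetical usage: a stand-in noise image and placeholder ROIs.
face = np.random.rand(256, 256)               # would be a grayscale face photo
sal = spectral_residual_saliency(face)
eye_roi = (60, 110, 40, 216)                  # placeholder eye-region box
mouth_roi = (170, 220, 80, 176)               # placeholder mouth-region box
print("eye region:", mean_roi_saliency(sal, eye_roi))
print("mouth region:", mean_roi_saliency(sal, mouth_roi))
```

If fixations concentrate on the eye region even when such a map assigns it no more saliency than the mouth, bottom-up saliency alone cannot account for the gaze bias; this is the reasoning behind the saliency analysis reported in the abstract.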

Highlights

  • Human faces are stimuli we are exposed to every day

  • This study examined whether facial features that are diagnostic of the current emotional expression are processed automatically, irrespective of the task at hand and of the face's position in the visual field

  • In Experiment 1, eye movements were recorded while participants performed an emotion classification, a gender discrimination or a passive viewing task

Introduction

Human faces are stimuli we are exposed to every day. Human communication relies largely on speech, but it is disambiguated by gestures and facial expressions. In line with this reasoning, emotionally expressive faces seem to be processed preferentially compared to neutral ones [1,2]. Already in 1944, Hanawalt showed that different facial features are important for distinguishing between specific emotions [3]. He suggested that the mouth is most informative for recognizing happy faces, whereas the eyes are most important for detecting fearful facial expressions.
