Abstract

Rapid and accurate processing of potential social threats is paramount to social thriving, and provides a clear evolutionary advantage. Though automatic processing of facial expressions has been assumed for some time, some researchers now question the extent to which this is the case. Here, we provide electrophysiological data from a psychological refractory period (PRP) dual-task paradigm in which participants had to decide whether a target face exhibited a neutral or fearful expression, as overlap with a concurrent auditory tone categorization task was experimentally manipulated. Specifically, we focused on four event-related potentials (ERPs) linked to emotional face processing, covering distinct processing stages and topographies: the early posterior negativity (EPN), the early frontal positivity (EFP), the late positive potential (LPP), and the face-sensitive N170. As expected, there was an emotion modulation of each ERP. Most importantly, this emotional response was significantly attenuated in proportion to the degree of task overlap for each component except the N170. In fact, when the central overlap was greatest, the emotion-specific amplitude was statistically null for the EFP and LPP, and only marginally different from zero for the EPN. N170 emotion modulation was, on the other hand, unaffected by central overlap. Thus, our results show that emotion-specific ERPs for three out of four processing stages, namely perceptual encoding (EPN), emotion detection (EFP), and content evaluation (LPP), are attenuated and even eliminated by central resource scarcity. Models assuming automatic processing should be revised to account for these results.

Highlights

  • Facial expressions of emotions are a powerful non-verbal social tool for externalizing internal states and making these salient to other individuals

  • Electrophysiological data extracted from the Fearful–Neutral difference early frontal positivity (EFP), late positive potential (LPP), and early posterior negativity (EPN) waveforms were each analyzed with a 3 (SOA) × 3 [Electrode: FC1, FCz, FC2 (EFP, LPP); or O1, Oz, O2 (EPN)] repeated measures analysis of variance (ANOVA)

  • Though one study is hardly enough to contradict years of empirical evidence, our results from three different event-related potentials (ERPs), each directly linked to a specific facial emotion processing stage, make a significant contribution to a growing body of literature inconsistent with this account, suggesting that this model should at the very least be tempered
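The 3 (SOA) × 3 (Electrode) repeated-measures ANOVA on the Fearful–Neutral difference amplitudes described above can be sketched as follows. This is a minimal illustration using synthetic data (subject count, factor labels, and amplitude values are placeholders, not values from the study); statsmodels' `AnovaRM` is one standard implementation of this design:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
n_subjects = 20                          # placeholder sample size
soas = ["short", "medium", "long"]       # three SOA levels
electrodes = ["FC1", "FCz", "FC2"]       # frontal sites (EFP/LPP analysis)

# One Fearful-minus-Neutral difference amplitude per subject and cell
rows = []
for subj in range(n_subjects):
    for soa in soas:
        for el in electrodes:
            rows.append({"subject": subj, "SOA": soa, "electrode": el,
                         "amplitude": rng.normal(1.0, 0.5)})  # synthetic µV
df = pd.DataFrame(rows)

# 3 x 3 within-subjects ANOVA: main effects of SOA and electrode,
# plus their interaction
aov = AnovaRM(df, depvar="amplitude", subject="subject",
              within=["SOA", "electrode"]).fit()
print(aov.anova_table)
```

The same call, with `O1`, `Oz`, `O2` as electrode labels, would cover the EPN analysis.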


Introduction

Facial expressions of emotions are a powerful non-verbal social tool for externalizing internal states and making these salient to other individuals. Rapid and effortless (i.e., automatic) processing of potential social threats, even ones that lie outside of attention, would provide a clear evolutionary advantage. Though this view has prevailed for some time (e.g., Palermo and Rhodes, 2007), several researchers question the extent to which the processing of facial emotions is automatic. Using a variant of this paradigm, Vuilleumier et al. (2001) found evidence of increased left amygdala activity in response to fearful (vs. neutral) facial expressions presented at both attended and unattended locations, consistent with the automatic processing account.
