Abstract

Given finite attentional resources, the extent to which the emotional aspects of stimuli are processed automatically remains controversial. The present study examined the time course of automatic processing of facial expressions by assessing the N170 and late positive potential (LPP) components of event-related potentials (ERPs) in a modified rapid serial visual presentation (RSVP) paradigm. Observers were required to identify a specified house image and to detect whether a face image was presented at the end of a series of pictures. There was no significant main effect of expression type on P1 amplitudes, whereas happy and fearful expressions elicited larger N170 amplitudes than neutral expressions. LPP amplitudes differed significantly across the three expression types (fear > happy > neutral). These results indicate that, in this implicit emotional task, no prioritization of threat was observed, but expressive faces were discriminated from neutral faces at approximately 250 ms post-stimulus. Moreover, the three expression types were discriminated from one another during later stages of processing. Thus, the encoding of emotional information from faces can proceed with a relatively high degree of automaticity, even when attentional resources are largely allocated to a superficial analysis task.
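For readers less familiar with how such component amplitudes are quantified, the sketch below shows one conventional way to compute condition-wise mean ERP amplitudes with MNE-Python. It is an illustration only, not the authors' analysis pipeline: the epoch file name, event labels, electrode sites, and time windows are assumptions made for demonstration.

```python
# Illustrative sketch (not the authors' pipeline): mean ERP amplitudes per
# emotion condition using MNE-Python. The epoch file name, event labels,
# electrodes, and time windows below are assumptions for demonstration only.
import mne

epochs = mne.read_epochs("rsvp_faces-epo.fif")  # hypothetical epoch file

# Illustrative measurement windows/sites; typical choices, not the paper's.
components = {
    "P1":   dict(picks=["O1", "O2"],  tmin=0.08, tmax=0.12),
    "N170": dict(picks=["P7", "P8"],  tmin=0.14, tmax=0.20),
    "LPP":  dict(picks=["Pz", "CPz"], tmin=0.40, tmax=0.80),
}

for emotion in ("happy", "fearful", "neutral"):   # assumed event labels
    evoked = epochs[emotion].average()            # condition-wise ERP
    for name, win in components.items():
        amp = (
            evoked.copy()
            .pick(win["picks"])
            .crop(win["tmin"], win["tmax"])
            .data.mean() * 1e6                    # volts -> microvolts
        )
        print(f"{emotion:8s} {name:5s} mean amplitude: {amp:6.2f} µV")
```

Mean amplitude over a fixed window and a small electrode cluster is a common, noise-robust way to quantify components such as the N170 and LPP before submitting them to statistical comparison.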

Highlights

  • Facial expression processing occurs in many situations, and people are highly efficient and fast at identifying emotional information from others’ expressions [1]

  • We selected face pictures from the native Chinese Facial Affective Picture System (CFAPS) to generate the emotional stimuli: 18 upright faces (6 happy, 6 neutral, and 6 fearful, evenly divided between male and female) served as targets and 12 inverted neutral faces served as distractors (a stimulus-stream sketch follows this list)

  • We tested the automaticity of facial expression processing when no intentional categorization task was required
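
To make the stimulus design above concrete, here is a minimal sketch of how one RSVP stream could be assembled from such a set. The file names, stream length, and the number of house images are hypothetical and not taken from the study.

```python
# Hypothetical sketch of assembling one RSVP stream from the stimulus set
# described above. File names, stream length, and the size of the house-image
# pool are assumptions for illustration; they are not taken from the paper.
import random

# 18 upright target faces: 3 emotions x 2 sexes x 3 exemplars each
target_faces = [
    f"cfaps_{emotion}_{sex}{i}.bmp"
    for emotion in ("happy", "neutral", "fearful")
    for sex in ("m", "f")
    for i in (1, 2, 3)
]
# 12 inverted neutral faces serving as distractors
distractor_faces = [f"cfaps_neutral_inverted_{i}.bmp" for i in range(1, 13)]
# House images used as the non-face items in the stream (count assumed)
house_images = [f"house_{i}.bmp" for i in range(1, 21)]


def build_rsvp_stream(stream_len=15, face_present=True, rng=random):
    """Return an ordered list of image files for one RSVP trial.

    The stream is filled with house images; when face_present is True,
    a single upright (target) or inverted (distractor) face is placed at
    the end of the series, mirroring the detection task described in the
    abstract. The stream length is an assumption.
    """
    n_houses = stream_len - 1 if face_present else stream_len
    stream = rng.sample(house_images, k=n_houses)
    if face_present:
        stream.append(rng.choice(target_faces + distractor_faces))
    return stream


if __name__ == "__main__":
    print(build_rsvp_stream())
```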


Introduction

Facial expression processing occurs in many situations, and people are highly efficient and fast at identifying emotional information from others’ expressions [1]. This processing occurs even when processing the emotion-related content is not required. Emotional stimuli, compared with neutral stimuli, elicit greater visual cortex activation during passive viewing [7, 8]. This activation has been attributed to “emotional attention,” defined as a predisposition to spontaneously allocate processing resources to emotional information [9, 10].

