Abstract
Numerous studies using the event-related potential (ERP) technique have found that emotional expressions modulate ERP components that appear at different post-stimulus onset times and are indicative of different stages of face processing. With the aim of studying the time course of the integration of context and facial expression information, we investigated whether these modulations are sensitive to the situational context in which emotional expressions are perceived. Participants were asked to identify the expression of target faces that were presented immediately after they read short sentences describing happy or anger-inducing situations. The main manipulation was the congruency between the emotional content of the sentences and the target expression. Context-independent amplitude modulation of the N170 and N400 components by emotional expression was observed. In contrast, context effects appeared on a later component (the late positive potential, or LPP), with enhanced amplitudes on incongruent trials. These results show that the early stages of face processing, during which emotional expressions are coded, are not sensitive to verbal information about the situation in which the expressions appear. The timing of the context congruency effects suggests that the integration of facial expression with situational information occurs at a later stage, probably related to the detection of affective congruency.