As social signals, identical facial expressions can be perceived differently, even oppositely, depending on the circumstances. Fast and accurate understanding of the information conveyed by others’ facial expressions is crucial for successful social interaction. In the current study, we used electroencephalographic analysis of several event-related potentials (ERPs) to investigate how the brain processes the facial expressions of others when they indicate different self-outcomes. In half of the trial blocks, a happy face indicated “Win” and an angry face indicated “Lose”; in the other half, the rule was reversed. The results showed that the N170 distinguished expression valence and the N300 distinguished outcome valence. The valence of the expression (happy or angry) and the valence of the outcome (Win or Lose) interacted in the early, automatic perceptual processing stage (N1) as well as in the later, cognitive evaluation stage (P300). Standardized Low-Resolution Electromagnetic Tomography (sLORETA) results indicated that the N1 modulation occurred only for happy faces, which may relate to automatic emotion regulation, whereas the interaction on the P300 was significant only for angry faces, which might be associated with the regulation of negative emotions.