Abstract

This study investigated the neurocognitive mechanisms underlying the roles of the eye and mouth regions in the recognition of facial happiness, anger, and surprise. To this end, face stimuli were shown in three formats (whole face, upper half visible, and lower half visible), and behavioral categorization, computational modeling, and event-related potential (ERP) measures were combined. The N170 (150–180 ms post-stimulus; right hemisphere) and the EPN (early posterior negativity; 200–300 ms; mainly right hemisphere) were modulated by the expression of whole faces, but not by separate halves. This suggests that expression encoding (N170) and emotional assessment (EPN) require holistic processing, mainly in the right hemisphere. In contrast, the mouth region of happy faces enhanced left temporo-occipital activity (150–180 ms) and also enhanced LPC (late positive complex; centro-parietal) activity earlier (350–450 ms) than the angry eyes (450–600 ms) or other face regions. Relatedly, computational modeling revealed that the mouth region of happy faces was also visually salient by 150 ms following stimulus onset. This suggests that analytical or part-based processing of the salient smile occurs early (150–180 ms), is lateralized to the left hemisphere, and is subsequently used as a shortcut to identify the expression of happiness (350–450 ms). This would account for the happy-face advantage in behavioral recognition tasks when the smile is visible.
