Abstract

Decoding emotions from others' faces is one of the most important functions of the human brain and has been widely studied in cognitive neuroscience. However, the precise time course of facial expression categorization in the human brain is still a matter of debate. Here we used an original paradigm to measure categorical perception of facial expression changes during event-related potential (ERP) recording, in which a face stimulus dynamically switched either to a different expression (between-category condition) or to the same expression (within-category condition), with the physical distance between the two successive faces equated across conditions. The switch between faces generated a negative differential potential peaking at around 160 ms over occipito-temporal regions, similar in terms of latency and topography to the well-known face-selective N170 component. This response was larger when the switch occurred between faces perceived as having different facial expressions than when they were perceived as having the same expression. In addition, happy expressions were categorized around 20 ms faster than fearful expressions (135 and 156 ms, respectively). These findings provide evidence that changes of facial expression are categorically perceived as early as 160 ms after stimulus onset over the occipito-temporal cortex.
