Abstract

Processing facial expressions is an essential component of social interaction, especially for preverbal infants. In human adults and monkeys, this process involves the motor system, with a neural matching mechanism believed to couple self- and other-generated facial gestures. Here, we used electroencephalography to demonstrate recruitment of the human motor system during observation and execution of facial expressions in nine-month-old infants, implicating this system in facial expression processing from a very young age. Notably, examination of early video-recorded mother-infant interactions supported the common, but as yet untested, hypothesis that maternal mirroring of infant facial gestures is central to the development of a neural matching mechanism for these gestures. Specifically, the extent to which mothers mirrored infant facial expressions at two months postpartum predicted infant motor system activity during observation of the same expressions at nine months. This suggests that maternal mirroring strengthens mappings between visual and motor representations of facial gestures, which increases infant neural sensitivity to particularly relevant cues in the early social environment.

Highlights

  • Accurate identification and analysis of facial expressions is critical for understanding others’ internal states[1], and for regulating social relationships

  • An extensive body of research with adults and nonhuman primates suggests that sensorimotor brain regions, including parietal and premotor cortices, could support facial expression processing, but whether this is the case in human infants has not been investigated

  • Our study aimed to address two important and outstanding questions concerning a facial action-perception network: i) is a mechanism coupling own and other facial expressions present in the human infant; and ii) if so, how does it develop? To answer the first question, we used electroencephalography (EEG) to measure event-related desynchronization (ERD) in the mu frequency band during observation/execution of facial expressions in a group of nine-month-old infants (a sketch of how mu-band ERD is conventionally computed follows this list)

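The mu-band ERD measure mentioned above is conventionally quantified as the percentage change in band-limited EEG power during an event (here, observation or execution of a facial expression) relative to a baseline period, with negative values indicating desynchronization. The Python sketch below illustrates that standard computation only; the 6–9 Hz band (a typical infant mu range), the sampling rate, and the synthetic data are illustrative assumptions and do not reproduce the authors' analysis pipeline.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def bandpass(data, low, high, fs, order=4):
        # Zero-phase Butterworth band-pass filter
        b, a = butter(order, [low, high], btype="band", fs=fs)
        return filtfilt(b, a, data)

    def mu_erd_percent(trial, baseline, fs, band=(6.0, 9.0)):
        # Percent change in mu-band power during the trial relative to
        # baseline; negative values indicate event-related desynchronization
        low, high = band
        trial_power = np.mean(bandpass(trial, low, high, fs) ** 2)
        base_power = np.mean(bandpass(baseline, low, high, fs) ** 2)
        return 100.0 * (trial_power - base_power) / base_power

    # Illustrative usage with synthetic single-channel data
    fs = 500                                  # sampling rate in Hz (assumed)
    rng = np.random.default_rng(0)
    baseline = rng.standard_normal(fs * 2)    # 2 s pre-stimulus segment
    trial = rng.standard_normal(fs * 2)       # 2 s observation segment
    print(f"mu ERD: {mu_erd_percent(trial, baseline, fs):.1f}%")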

Introduction

Accurate identification and analysis of facial expressions is critical for understanding others’ internal states[1], and for regulating social relationships. An extensive body of research with adults and nonhuman primates suggests that sensorimotor brain regions, including parietal and premotor cortices, could support facial expression processing (e.g., [22–24]), but whether this is the case in human infants has not been investigated. Recruitment of these parietal-premotor regions while observing others’ actions is widely thought to implement a mapping from the visual representation of an action to its corresponding motor representation[25]. This ‘mirror’ or ‘action-perception matching’ mechanism is believed to play a key role in the visual processing of others’ behaviour and in regulating social interactions[26,27]. Through maternal imitation (or ‘mirroring’), infants could observe the visual consequences of their own facial movements, providing the sensorimotor experience necessary to strengthen a link between motor and visual representations of facial gestures[30,41,45].

