Abstract

The present study investigated whether, as in adults, 7-month-old infants’ sensorimotor brain areas are recruited in response to the observation of emotional facial expressions. Activity of the sensorimotor cortex, as indexed by µ rhythm suppression, was recorded using electroencephalography (EEG) while infants observed neutral, angry, and happy facial expressions in either a static (N = 19) or dynamic (N = 19) condition. Graph theory analysis was used to investigate to what extent neural activity was functionally localized in specific cortical areas. Happy facial expressions elicited greater sensorimotor activation than angry faces in the dynamic experimental condition, while no difference was found among the three expressions in the static condition. Results also revealed that happy, but not angry or neutral, expressions elicited a significant right-lateralized activation in the dynamic condition. Furthermore, dynamic emotional faces were processed more efficiently, eliciting higher global efficiency and lower network diameter compared to static faces. Overall, the current results suggest that, in contrast to neutral and angry faces, happy expressions elicit sensorimotor activity at 7 months, and that dynamic emotional faces are more efficiently processed by functional brain networks. Finally, the current data provide evidence of right-lateralized activity for the processing of happy facial expressions.
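Global efficiency and network diameter, the two graph-theoretic measures mentioned above, are standard metrics of network integration. A minimal sketch of how they are computed on a connectivity graph, using NetworkX and two toy graphs (illustrative only, not the study's actual data):

```python
import networkx as nx

# Toy undirected graphs standing in for thresholded functional
# connectivity networks (nodes = electrodes/regions, edges = connections).
dense = nx.complete_graph(4)   # every node connected: maximally integrated
sparse = nx.path_graph(4)      # chain 0-1-2-3: poorly integrated

# Global efficiency: mean of 1/shortest-path-length over all node pairs
# (higher = more efficient information transfer).
# Diameter: longest shortest path between any two nodes
# (lower = more compact network).
for name, g in [("dense", dense), ("sparse", sparse)]:
    print(name,
          round(nx.global_efficiency(g), 3),
          nx.diameter(g))
# prints:
#   dense 1.0 1
#   sparse 0.722 3
```

Higher global efficiency together with a smaller diameter, as reported for the dynamic condition, indicates shorter average communication paths across the network.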

Highlights

  • The current study investigates 7-month-old infants’ sensorimotor response to static and dynamic facial expressions of happiness and anger, and how the neural networks underlying the processing of emotional expressions may be organized at this age.

  • The analysis of variance (ANOVA) yielded a significant main effect of emotion, F(2,72) = 3.57; p = 0.03, η2p = 0.09, with happy expressions (M = −0.14 μV; SD = 0.35 μV) eliciting greater sensorimotor alpha suppression compared to angry faces (M = 0.004 μV; SD = 0.35 μV), irrespective of the experimental condition.

  • Research has demonstrated that 5-month-olds presented with dynamic facial expressions display an attentional bias towards fearful faces at an earlier age [77,78], and that 7-month-old infants show a differential modulation of event-related potential responses to dynamic vs. static emotional faces [21]. Adding to this body of evidence, the current results further suggest that the perception of dynamic, compared to static, emotional faces augments sensorimotor activation to happy relative to angry faces.
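The partial eta squared reported for the main effect of emotion can be recovered directly from the F statistic and its degrees of freedom. A minimal check, using only the values reported in the highlight above:

```python
# Partial eta squared from an F statistic:
#   eta_p^2 = (F * df_effect) / (F * df_effect + df_error)
# Values below are those reported for the main effect of emotion.
F, df_effect, df_error = 3.57, 2, 72

eta_p2 = (F * df_effect) / (F * df_effect + df_error)
print(round(eta_p2, 2))  # prints 0.09, matching the reported effect size
```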


Introduction

Perception and interpretation of others’ faces play a crucial role in human communication, learning about the social and physical world, regulating our emotions, and developing relationships with others. This is especially true early in life, when infants cannot rely on language to understand others’ behaviors, but mainly observe and interpret gestures and facial expressions to grasp others’ intentions and feelings [2,3]. While considerable efforts have been devoted to elucidating the neural underpinnings of the early development of emotion processing (e.g., [6,7]), little is known about the role of sensorimotor areas in the processing of facial expressions during infancy. The current study addresses this issue by investigating 7-month-old infants’ sensorimotor response to static and dynamic facial expressions of happiness and anger, and how the neural networks underlying the processing of emotional expressions may be organized at this age.
