Abstract

Objective: Brain rhythms of both hemispheres are involved in the processing of emotional stimuli, but their interdependence across the two hemispheres is poorly understood. Here we tested the hypothesis that passive visual perception of facial emotional expressions involves coordination of the two hemispheres, as revealed by the inter-hemispheric functional coupling of electroencephalographic (EEG) rhythms.

Methods: To this aim, EEG data were recorded in 14 subjects observing faces with neutral, happy, or sad expressions (about 33% for each class). The EEG data were analyzed by the directed transfer function (DTF), which estimates the directional functional coupling of EEG rhythms. The rhythms of interest were theta (about 4–6 Hz), alpha 1 (about 6–8 Hz), alpha 2 (about 8–10 Hz), alpha 3 (about 10–12 Hz), beta 1 (13–20 Hz), beta 2 (21–30 Hz), and gamma (31–44 Hz).

Results: In the frontal regions, inter-hemispheric DTF values were bidirectionally higher in amplitude across all frequency bands during the perception of faces with sad compared to neutral or happy expressions.

Conclusions: These results suggest that the processing of negative emotional facial expressions is related to an enhancement of a reciprocal inter-hemispheric flow of information in the frontal cortex, possibly optimizing executive functions and motor control.

Significance: A dichotomous view of hemispheric functional specialization does not take into account the remarkable reciprocal interactions between the frontal areas of the two hemispheres during the processing of negative facial expressions.
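The DTF mentioned above is conventionally derived from a multivariate autoregressive (MVAR) model fitted to the EEG channels: the model's spectral transfer matrix H(f) is computed, and each entry is normalized by the total inflow to its target channel. The sketch below is a minimal, hypothetical illustration of that normalization step only (the MVAR coefficients are assumed given, e.g. from a separate fitting routine); it is not the authors' analysis pipeline.

```python
import numpy as np

def dtf(ar_coeffs, freqs, fs):
    """Directed transfer function from given MVAR coefficients.

    ar_coeffs : (p, n, n) array; ar_coeffs[k][i, j] is the lag-(k+1)
                influence of channel j's past on channel i.
    freqs     : frequencies (Hz) at which to evaluate the DTF.
    fs        : sampling rate (Hz).
    Returns a (len(freqs), n, n) array of DTF values, normalized so that
    the squared inflows to each channel sum to 1 at every frequency.
    """
    p, n, _ = ar_coeffs.shape
    out = np.empty((len(freqs), n, n))
    for fi, f in enumerate(freqs):
        # A(f) = I - sum_k A_k exp(-i 2 pi f (k+1) / fs)
        A = np.eye(n, dtype=complex)
        for k in range(p):
            A -= ar_coeffs[k] * np.exp(-2j * np.pi * f * (k + 1) / fs)
        H = np.linalg.inv(A)          # spectral transfer matrix
        mag = np.abs(H)
        # Normalize each row (target channel) by its total inflow.
        out[fi] = mag / np.sqrt((mag ** 2).sum(axis=1, keepdims=True))
    return out

# Toy 2-channel system with coupling only from channel 1 to channel 2:
ar = np.array([[[0.5, 0.0],
                [0.4, 0.5]]])
D = dtf(ar, freqs=np.array([10.0]), fs=128)
```

In this toy model the DTF from channel 2 to channel 1 (`D[0, 0, 1]`) is zero, while the DTF from channel 1 to channel 2 (`D[0, 1, 0]`) is positive, reflecting the one-way coupling built into the coefficients.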

