Emotion is a rapidly changing psychological and physiological phenomenon. In daily life, people draw on multiple sensory modalities (visual, auditory, etc.) to perceive non-verbal emotional information. Non-verbal emotion conveyed by faces and voices is often complex and varied. Previous studies have revealed common and distinct neural networks underlying the perception of human faces and voices. However, the neural mechanisms underlying visual and auditory emotional perception have not been well studied. Furthermore, despite research on the audiovisual integration of cross-modal emotional information, the multisensory cortex supporting visual and auditory emotional information remains elusive. It is therefore necessary to examine the similarities and differences between the neural mechanisms of emotion perception in the visual and auditory modalities, and to identify the multisensory cortex underlying cross-modal emotion perception. The present functional magnetic resonance imaging (fMRI) study adopted a 2 × 3 (stimulus presentation modality: visual, auditory; emotional valence: happy, sad, fearful) event-related design to investigate the neural mechanisms of emotion perception in the visual and auditory modalities. When a stimulus (an emotional face or voice) was presented visually or aurally, participants made a gender judgement. The results showed that activation for emotional faces in V1−V4, the bilateral fusiform gyrus, and the bilateral superior temporal sulcus (STS) was significantly stronger than for emotional voices. Conversely, activation for emotional voices in the auditory cortex (AC) was significantly stronger than for emotional faces.
Multivoxel pattern analysis (MVPA) showed that activation patterns in the right STS could discriminate faces of different emotional valence (happy, sad, and fearful), indicating that the right STS plays an important role in perceiving faces with different emotional valence, and that activation patterns in the right fusiform face area (FFA) differed between happy and sad faces, indicating that the right FFA is crucial for the perception of positive and negative emotional faces. A voxel-based whole-brain analysis was further performed to identify cortical areas modulated by emotional valence. This analysis revealed a significant main effect of emotional valence in the opercular part of the left inferior frontal gyrus, suggesting that this region may be a multisensory cortex for visual-auditory emotional perception. In summary, our study provides important evidence for further understanding emotion perception in different modalities and the multisensory cortex underlying cross-modal emotion perception.
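The MVPA decoding step described above can be illustrated with a minimal sketch, assuming a standard linear-classifier approach with cross-validation (this is not the authors' code; the trial counts, voxel counts, and simulated data are hypothetical, stand-ins for single-trial beta estimates extracted from an ROI such as the right FFA):

```python
# Hypothetical MVPA sketch: decode emotional valence (happy vs. sad)
# from simulated ROI voxel patterns with a linear SVM (scikit-learn).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 60, 100  # assumed trials per condition and ROI size

# Simulated single-trial activation patterns: the two conditions differ
# by a small mean offset spread across voxels (stand-in for real betas).
patterns_happy = rng.normal(0.0, 1.0, (n_trials, n_voxels))
patterns_sad = rng.normal(0.5, 1.0, (n_trials, n_voxels))
X = np.vstack([patterns_happy, patterns_sad])
y = np.array([0] * n_trials + [1] * n_trials)  # 0 = happy, 1 = sad

clf = LinearSVC()  # linear decoder over voxel activation patterns
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
print(f"mean decoding accuracy: {scores.mean():.2f}")
```

Above-chance cross-validated accuracy (here, well above 0.5) is the criterion for saying a region's activation patterns "discriminate" the conditions; in practice, fold assignment would follow scanning runs rather than arbitrary splits.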