Abstract
This study examined the neural areas involved in the recognition of both emotional prosody and phonemic components of words expressed in spoken language using echo-planar, functional magnetic resonance imaging (fMRI). Ten right-handed males were asked to discriminate words based on either expressed emotional tone (angry, happy, sad, or neutral) or phonemic characteristics, specifically, initial consonant sound (bower, dower, power, or tower). Significant bilateral activity was observed in the detection of both emotional and verbal aspects of language when compared to baseline activity. We found that the detection of emotion compared with verbal detection resulted in significant activity in the right inferior frontal lobe. Conversely, the detection of verbal stimuli compared with the detection of emotion activated left inferior frontal lobe regions most significantly. Specific analysis of the anterior auditory cortex revealed increased right hemisphere activity during the detection of emotion compared to activity during verbal detection. These findings illustrate bilateral involvement in the detection of emotion in language while concomitantly showing significantly lateralized activity in both emotional and verbal detection, in both the temporal and frontal lobes.