Abstract

In communication, language can be interpreted differently depending upon the emotional context. To clarify the effect of emotional context on language processing, we performed experiments using a cross-modal priming paradigm with an auditorily presented prime and a visually presented target. The primes were the names of people that were spoken with a happy, sad, or neutral intonation; the targets were interrogative one-word sentences with emotionally neutral content. Using magnetoencephalography, we measured neural activities during silent reading of the targets presented in a happy, sad, or neutral context. We identified two conditional differences: the happy and sad conditions produced less activity than the neutral condition in the right posterior inferior and middle frontal cortices in the latency window from 300 to 400 ms; the happy and neutral conditions produced greater activity than the sad condition in the left posterior inferior frontal cortex in the latency window from 400 to 500 ms. These results suggest that the use of emotional context stored in the right frontal cortex starts at ∼300 ms, that integration of linguistic information with emotional context starts at ∼400 ms in the left frontal cortex, and that language comprehension dependent on emotional context is achieved by ∼500 ms.
