Abstract

Sentence prosody has long been known to serve both linguistic functions (e.g. differentiating between questions and statements) and emotional functions (e.g. conveying the emotional state of a speaker). Both functions of prosodic information need to be encoded rapidly during sentence comprehension to ensure successful speech communication. However, systematic investigations of how the two functions compare, i.e. whether they are independent or interdependent, are sparse. The question at hand is whether the two prosodic functions engage a similar neural network and follow a similar time-course. To this end, we investigated whether emotional and linguistic prosody are processed independently or interdependently in an event-related brain potential (ERP) experiment. We spliced a prosodically neutral sentence beginning onto a second sentence half that differed in emotional and/or linguistic prosody. In a within-subjects design, two tasks were administered: in the “emotion task”, participants judged whether the sentence they had just heard was spoken in a neutral tone of voice or not; in the “linguistic task”, participants decided whether the sentence was a declarative sentence or not. As predicted, the previously reported prosodic expectancy positivity (PEP) was elicited by both linguistic and emotional prosodic expectancy violations. However, the latency and distribution of the ERP component differed: whilst the positivity in response to emotional prosodic expectancy violations emerged shortly after the expectancy violation (∼470 ms post splicing point) and was most prominent at posterior electrode sites, the positivity in response to linguistic prosodic violations had a later onset (∼620 ms post splicing point) and a more frontal distribution. Interestingly, combined (linguistic and emotional) expectancy violations elicited a broadly distributed positivity with an onset of ∼170 ms post expectancy violation. These effects were found irrespective of the task setting. Given the differences in latency and distribution, we conclude that the processing of emotional and linguistic prosody relies at least partly on distinct neural mechanisms, and that emotional prosodic aspects of language are processed in a prioritized processing stream.
