Abstract

Successful communication relies on efficient use of information from multiple modalities and dimensions. Our previous research using a cross-modal priming paradigm showed that congruent visual primes facilitate faster recognition of phonetic and emotional prosodic information in speech. Event-related potential (ERP) data further revealed distinct brain mechanisms in the N400 and late positive response (LPR) components for processing linguistic and paralinguistic congruency. The current study extended the same paradigm to English-as-a-second-language (ESL) learners to examine possible interference between the two informational dimensions as a function of language experience. Participants were sixteen normal adult ESL learners. Monosyllables /bab/ and /bib/ spoken in a happy or angry tone served as the auditory stimuli, and pictures of the speaker articulating the vowels /a/ and /i/ with a happy or angry facial expression served as the visual primes. Compared to native English speakers, ESL learners showed significantly longer reaction times, with inconsistent congruency effects in both conditions. However, their behavioral accuracy data mirrored those of the native speakers, and native-like N400 and LPR components were also reliably elicited in the ESL group in both conditions. Together, these results indicate stronger cross-modal interference between phonetic and emotional prosodic information in speech for second-language learners.
