Abstract

Human speech conveys not only linguistic content but also important information about speaker identity and affect. This study used whole-head magnetoencephalography (MEG) to examine how brain activity is modulated by selective listening to phoneme, affect, and gender information at different levels of task difficulty. The participants were 10 male Japanese adults with normal hearing. The stimuli were the words ‘right’ and ‘light’, recorded from native English speakers and presented binaurally at 50 dB SL. On each trial, participants judged the congruency between a visual prime and the spoken word. The experiment began with a familiarization phase, immediately followed by a test phase of 200 trials per condition. Behavioral results confirmed an increasing order of difficulty from the gender to the affect to the phoneme condition. Significant priming effects were found only in the affect and gender conditions. Consistent with the behavioral results, the MEG data revealed distinct patterns of hemispheric and regional involvement and of neural oscillatory activity during evaluation of cross-modal congruency in the three conditions. These results demonstrate the neural dynamics and complexity of processing linguistic and paralinguistic information in spoken words, with differential influences of language experience.
