Abstract

Even when speakers are not actively performing another task, concurrent auditory stimuli can interfere with their speech planning. In this study, we combined picture naming with passive hearing or active listening, together with high-density electroencephalographic (EEG) recordings, to investigate the locus and origin of interference on speech production. Participants named pictures while ignoring (or attending to) auditory syllables presented at different intervals (+150 ms, +300 ms, or +450 ms). Interference from passive hearing was observed at all positive stimulus onset asynchronies (SOAs), including when distractors appeared 450 ms after picture onset. Analyses of ERPs and microstates revealed modulations in a time window close to verbal response onset, likely relating to post-lexical planning processes. A latency shift of the auditory N1 component for syllables presented 450 ms after picture onset, relative to hearing in isolation, was also observed. Data from picture naming with active listening to auditory syllables also pointed to post-lexical interference. The present study suggests that, beyond the lexical stage, post-lexical processes are subject to interference, and that the reciprocal interference between utterance planning and hearing depends on attentional demand and possibly on competing neural substrates.

Highlights

  • Even when speakers are not actively performing another task, concurrent auditory stimuli can interfere with their speech planning

  • A graded interference effect, indexed by lower accuracy and slower reaction times, was observed in naming during passive hearing as stimulus onset asynchrony (SOA) increased

  • As in passive hearing, the results indicate that auditory syllables interfere with picture naming even at a late SOA


Introduction

Even when speakers are not actively performing another task, concurrent auditory stimuli can interfere with their speech planning. By combining stimulus onset asynchronies (SOAs) with high-density electroencephalographic (EEG) recordings, we investigate whether and when interference occurs when participants hear, but are not required to actively react to, concurrent stimuli while they are planning speech. Driving is one of the best examples of a daily situation that requires sharing attention across multiple tasks, such as the visual processing of traffic-related objects and auditory processing (e.g., radio programs, conversation). The sensitivity of these tasks to attentional interference has been demonstrated in several studies[11], but some[12] further indicated that processing costs on visual tasks emerged only when participants had to act upon the concurrent auditory information. Whereas some speech planning processes, such as lexical selection[17,18], require attentional control, others, such as the encoding of the phonological form of the utterance and its execution, are more automatic.

