Abstract

When multiple people speak simultaneously, it may be difficult for the listener to direct attention to the correct sound object among conflicting ones. This could occur, for example, in an emergency situation in which one hears conflicting instructions and the loudest, instead of the wisest, voice prevails. Here, we used cortically constrained oscillatory MEG/EEG estimates to examine how different brain regions, including the caudal anterior cingulate (cACC) and dorsolateral prefrontal (DLPFC) cortices, work together to resolve such auditory conflicts. During an auditory flanker interference task, subjects were presented with sound patterns consisting of three different voices, from three different directions (45° left, straight ahead, 45° right), sounding out the letter “A” or “O”. They were asked to discriminate which sound was presented centrally and to ignore the flanking distracters, which were phonetically either congruent (50%) or incongruent (50%) with the target. Our cortical MEG/EEG oscillatory estimates demonstrated a direct relationship between performance and brain activity: efficient conflict resolution, as measured by reduced conflict-induced reaction time (RT) lags, was predicted by theta/alpha phase coupling between the cACC and right lateral frontal cortex regions intersecting the right frontal eye fields (FEF) and DLPFC, as well as by increased pre-stimulus gamma (60–110 Hz) power in the left inferior frontal cortex. Notably, cACC connectivity patterns that correlated with behavioral conflict-resolution measures were found during both the pre-stimulus and the pre-response periods. Our data provide evidence that, rather than being only transiently activated upon conflict detection, the cACC is involved in the sustained engagement of attentional resources required for effective sound object selection.
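
The two neural measures named above, phase coupling between region time courses and band-limited gamma power, can be illustrated with a minimal sketch. The Python code below (NumPy/SciPy) is an assumption-laden illustration rather than the authors’ actual MEG/EEG source-estimation pipeline; the 600-Hz sampling rate, the band edges, the trial structure, and the synthetic cacc/rdlpfc arrays are all hypothetical.

    # Illustrative sketch only (synthetic data; not the authors' pipeline).
    import numpy as np
    from scipy.signal import butter, sosfiltfilt, hilbert

    FS = 600.0  # assumed sampling rate in Hz (hypothetical)

    def bandpass(x, lo, hi, fs=FS, order=4):
        # Zero-phase Butterworth band-pass along the time (last) axis.
        sos = butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
        return sosfiltfilt(sos, x, axis=-1)

    def phase_locking_value(x, y, lo, hi):
        # PLV between two (n_trials, n_times) arrays: the magnitude of the
        # trial-averaged unit vector of the instantaneous phase difference.
        px = np.angle(hilbert(bandpass(x, lo, hi), axis=-1))
        py = np.angle(hilbert(bandpass(y, lo, hi), axis=-1))
        return np.abs(np.mean(np.exp(1j * (px - py)), axis=0))

    def band_power(x, lo, hi):
        # Per-trial mean power in a band (e.g., 60-110 Hz gamma), taken from
        # the squared envelope of the analytic signal.
        env = np.abs(hilbert(bandpass(x, lo, hi), axis=-1))
        return (env ** 2).mean(axis=-1)

    # Toy data: 100 one-second trials in which two "sources" share an 8-Hz
    # component with a common (but trial-varying) phase.
    rng = np.random.default_rng(0)
    t = np.arange(int(FS)) / FS
    phase = rng.uniform(0, 2 * np.pi, size=(100, 1))
    shared = np.sin(2 * np.pi * 8.0 * t + phase)
    cacc = shared + 0.5 * rng.standard_normal((100, t.size))
    rdlpfc = shared + 0.5 * rng.standard_normal((100, t.size))

    plv_alpha = phase_locking_value(cacc, rdlpfc, 7.0, 9.0)  # near 1 here
    gamma_power = band_power(cacc, 60.0, 110.0)              # per-trial power
    print(plv_alpha.mean(), gamma_power.mean())

In this toy example the two signals share an 8-Hz component, so the alpha-band PLV is close to 1; with phases independent across trials it would drop toward a chance floor of order 1/sqrt(n_trials). In practice, such measures would be computed on cortically constrained source estimates rather than synthetic data.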

Highlights

  • Speech perception in everyday acoustic environments is a considerable computational challenge

  • Consistent with the hypothesis that efficient conflict processing depends on cingulofrontal connectivity, we found that reduced conflict-induced reaction time (RT) lags correlated with increased 8-Hz alpha phase locking between the bilateral caudal ACC (cACC) seeds and right lateral prefrontal regions

  • Consistent with our hypothesis, the results demonstrate that efficient conflict resolution during sound object selection, as measured by reduced conflict-induced RT lags, is predicted by theta/alpha phase coupling between the cACC and right lateral frontal cortex regions intersecting the frontal eye fields (FEF) and dorsolateral prefrontal cortex (DLPFC)

Introduction

Speech perception in everyday acoustic environments is a considerable computational challenge. Even when the relevant and irrelevant sounds can be perceptually grouped into distinct objects, the similarity of these objects can make it difficult for the listener to direct attention to the correct object in the auditory scene (i.e., object selection) [1]. This could happen, for example, when one hears conflicting pieces of advice in an emergency situation, and the most salient, rather than the most relevant, object tends to prevail. The question of how different brain regions work together to select relevant objects while ignoring conflicting stimuli in a multitalker environment falls within the cognitive realm of conflict processing. Given methodological limitations, such as the poor temporal resolution of fMRI, the prevailing neuroimaging technique, it is not yet clear how these regions work together as a functional network during conflict processing.
