Abstract

Previous studies have shown that language can modulate visual perception by biasing and/or enhancing perceptual performance. However, it is still debated where in the brain visual and linguistic information are integrated, and whether the effects of language on perception are automatic and persist even in the absence of awareness of the linguistic material. Here, we aimed to explore the automaticity of language-perception interactions and the neural loci of these interactions in an fMRI study. Participants engaged in a visual motion discrimination task (upward or downward moving dots). Before each trial, a word prime was briefly presented that implied upward or downward motion (e.g., “rise”, “fall”). These word primes strongly influenced behavior: congruent motion words sped up reaction times and improved performance relative to incongruent motion words. Neural congruency effects were observed only in the left middle temporal gyrus, which showed higher activity for congruent compared to incongruent conditions. This suggests that higher-level conceptual areas, rather than sensory areas, are the locus of language-perception interactions. When the motion words were masked and thus presented outside of awareness, they still affected visual motion perception, suggesting that language-perception interactions may rely on automatic feed-forward integration of perceptual and semantic material in language areas of the brain.

Highlights

  • Previous studies have shown that language can modulate visual perception, by biasing and/or enhancing perceptual performance

  • We investigated the dependence of language-perception interactions on awareness and the neural loci of these interactions

  • In a visual motion discrimination task in which attention was explicitly directed to motion word primes, congruent motion words significantly sped up reaction times and improved performance relative to incongruent motion words

Introduction

Previous studies have shown that language can modulate visual perception by biasing and/or enhancing perceptual performance. Neural congruency effects were observed only in the left middle temporal gyrus (lMTG), which showed higher activity for congruent compared to incongruent conditions. This suggests that higher-level conceptual areas, rather than sensory areas, are the locus of language-perception interactions. The lMTG is an area involved in lexical retrieval, including word semantics, and in multisensory processing and integration [9], so this neural counterpart of the behavioral facilitation effect is in line with an interaction of language and perception at a conceptual (semantic) processing stage. We also examined whether language-perception interactions depend on awareness of the linguistic stimuli, i.e., whether language still affects perception, in terms of brain and behavior, when participants are unaware of the motion words.
