Abstract

A bilingual's language system is highly interactive. When hearing a second language (L2), bilinguals access native-language (L1) words that share sounds across languages. In the present study, we examine whether input modality and L2 proficiency moderate the extent to which bilinguals activate L1 phonotactic constraints (i.e., rules for combining speech sounds) during L2 processing. Eye movements of English monolinguals and Spanish-English bilinguals were tracked as they searched for a target English word in a visual display. On critical trials, displays included a target that conflicted with the Spanish vowel-onset rule (e.g., spa), as well as a competitor containing the potentially activated 'e' onset (e.g., egg). The rule violation was processed either in the visual modality (Experiment 1) or audio-visually (Experiment 2). In both experiments, bilinguals with lower L2 proficiency made more eye movements to competitors than fillers. Findings suggest that bilinguals who have lower L2 proficiency access L1 phonotactic constraints during L2 visual word processing with and without auditory input of the constraint-conflicting structure (e.g., spa). We conclude that the interactivity between a bilingual's two languages is not limited to words that share form across languages, but also extends to sub-lexical, rule-based structures.
