Abstract

Event-related potentials (ERPs) were used to investigate co-activation of English words during recognition of American Sign Language (ASL) signs. Deaf and hearing signers viewed pairs of ASL signs and judged their semantic relatedness. Half of the semantically unrelated signs had English translations that shared an orthographic and phonological rime (e.g., BAR–STAR) and half did not (e.g., NURSE–STAR). Classic N400 and behavioral semantic priming effects were observed in both groups. For hearing signers, targets in sign pairs with English rime translations elicited a smaller N400 compared to targets in pairs with unrelated English translations. In contrast, a reversed N400 effect was observed for deaf signers: target signs in English rime translation pairs elicited a larger N400 compared to targets in pairs with unrelated English translations. This reversed effect was overtaken by a later, more typical ERP priming effect for deaf signers who were aware of the manipulation. These findings provide evidence that implicit language co-activation in bimodal bilinguals is bidirectional. However, the distinct pattern of effects in deaf and hearing signers suggests that it may be modulated by differences in language proficiency and dominance as well as by asymmetric reliance on orthographic versus phonological representations.

Highlights

  • Much research on bilingualism focuses on how a bilingual’s two languages interact (e.g., [1,2]); a topic of particular interest is the extent to which one language is co-activated while the other is being processed

  • A linear mixed-effects (LME) analysis of reaction times (RTs) indicated that deaf signers responded faster overall than hearing signers, t = 3.25, 95% CI = [171.68, 695.43] (see the sketch after these highlights)

  • Tables containing means and variances for each of the event-related potential (ERP) effects reported below are available in Supplementary Materials Table S2
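
A minimal sketch of how such a linear mixed-effects analysis of RTs might be specified, assuming a trial-level data frame with hypothetical columns rt, group, and subject; this is an illustration of the general technique, not the authors' actual analysis script:

    # Hypothetical LME analysis of reaction times: fixed effect of signer group,
    # random intercepts by participant. File and column names are assumptions.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("rt_data.csv")             # one row per trial: rt, group, subject
    model = smf.mixedlm("rt ~ group",           # fixed effect: deaf vs. hearing signers
                        data=df,
                        groups=df["subject"])   # random intercept per participant
    result = model.fit()

    print(result.summary())                     # coefficient estimates and test statistics
    print(result.conf_int())                    # 95% confidence intervals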


Introduction

Much research on bilingualism is focused on how a bilingual’s two languages interact (e.g., [1,2]). A topic of particular interest is the extent to which one language is co-activated while the other is being processed. Language co-activation has been well documented in unimodal bilinguals, i.e., users of two spoken languages (e.g., [3,4,5]), but fewer studies have examined this topic in bimodal bilinguals, i.e., users of both a signed and a spoken/written language (see [6] for a review). Deaf bimodal bilinguals are fluent in a signed language, are literate in the written form of a spoken language, and may have varying degrees of fluency in the spoken form of that language. Hearing bimodal bilinguals are proficient in a signed language and are fluent in both the spoken and written forms of a spoken language. Studies with bimodal bilinguals can address how language co-activation occurs across modalities and whether the processes involved are the same as or different from those involved in unimodal language co-activation (see [7] for discussion).
