Abstract

We used phonological priming and ERPs to investigate the organization of the lexicon in American Sign Language. Across go/no-go repetition detection and semantic categorization tasks, targets in related pairs that shared handshape and location elicited smaller N400s than targets in unrelated pairs, indicative of facilitated processing. Handshape-related targets also elicited smaller N400s than unrelated targets, but only in the repetition task. The location priming effect reversed direction across tasks, with slightly larger amplitude N400s for targets in related versus unrelated pairs in the semantic task, indicative of interference. These patterns imply that handshape and location play different roles during sign recognition and that there is a hierarchical organization for the sign lexicon. Similar to interactive-activation models of word recognition, we argue for differentiation between sublexical facilitation and lexical competition. Lexical competition is primarily driven by the location parameter and is more engaged when identification of single lexico-semantic entries is required.

Highlights

  • Signs, like spoken words, are composed of a discrete set of sublexical phonological units that yield contrastive minimal pairs

  • The goal of the present study was to improve our understanding of how the sign language lexicon is organized and to delineate the layered processes that unfold during sign recognition

  • We investigated how the type and degree of phonological overlap influences sign recognition in the context of two tasks that require different levels of lexico-semantic processing


Introduction

Like spoken words, signs are composed of a discrete set of sublexical phonological units that yield contrastive minimal pairs. The three primary phonological units, or parameters, in sign languages are location, handshape, and movement (for reviews of sign language phonology, see Brentari, 2019; Sandler & Lillo-Martin, 2006). The investigation of the influence of phonological overlap in priming paradigms has contributed to our understanding of the processes involved in auditory and visual word recognition (for reviews, see, e.g., Dufour, 2008; McQueen & Sereno, 2005). The literature on phonological priming in sign language, however, has not been systematic; drawing conclusions often requires comparisons across studies that differ in stimuli, language, and task, among other important variables. We used event-related potentials (ERPs) to clarify the contributions of the handshape and location parameters to phonological priming across two tasks that require different levels of lexico-semantic processing. The overall goals of the study were to better understand sign recognition processes and the organization of the sign language lexicon.
