Abstract
We investigated the relevance of linguistic and perceptual factors to sign processing by comparing hearing individuals and deaf signers as they performed a handshape monitoring task, a sign-language analogue of the phoneme-monitoring paradigms used in many spoken-language studies. Each subject saw a series of brief video clips, each of which showed either an American Sign Language (ASL) sign or a phonologically possible but nonlexical “nonsign,” and responded when the viewed action was formed with a particular handshape. Stimuli varied with respect to three factors: Lexicality; handshape Markedness; and Type, defined by whether the action is performed with one hand or two and, for two-handed stimuli, whether or not the action is symmetrical. Deaf signers performed faster and more accurately than hearing nonsigners, and effects of handshape Markedness and stimulus Type were observed in both groups. However, no effects or interactions related to Lexicality were found. A further analysis restricted to the deaf group indicated that these results did not depend on subjects’ age of acquisition of ASL. This work provides new insights into the processes by which the handshape component of sign forms is recognised in a sign language, the role of language experience, and the extent to which these processes may or may not be considered specifically linguistic.