Abstract
Sensory and language experience can affect brain organization and domain-general abilities. For example, D/deaf individuals show superior visual perception compared to hearing controls in several domains, including the perception of faces and peripheral motion. While these enhancements may result from sensory loss and subsequent neural plasticity, they may also reflect experience using a visual-manual language, like American Sign Language (ASL), where signers must process moving hand signs and facial cues simultaneously. In an effort to disentangle these concurrent sensory experiences, we examined how learning sign language influences visual abilities by comparing bimodal bilinguals (i.e., sign language users with typical hearing) and hearing non-signers. Bimodal bilinguals and hearing non-signers completed online psychophysical measures of face matching and biological motion discrimination. No significant group differences were observed across these two tasks, suggesting that sign language experience is insufficient to induce perceptual advantages in typical-hearing adults. However, ASL proficiency (but not years of experience or age of acquisition) was found to predict performance on the motion perception task among bimodal bilinguals. Overall, the results presented here highlight a need for more nuanced study of how linguistic environments, sensory experience, and cognitive functions impact broad perceptual processes and underlying neural correlates.