Abstract
Sign languages demonstrate a higher degree of iconicity than spoken languages. Studies of several unrelated sign languages show that the event structure of verb signs is reflected in their phonological form (Wilbur, 2008; Malaia & Wilbur, 2012; Krebs et al., 2021). Previous research has shown that hearing non-signers (with no prior exposure to sign language) can use the iconicity inherent in the visual dynamics of a verb sign to correctly identify its event structure (telic vs. atelic). In two EEG experiments, hearing non-signers were presented with telic and atelic verb signs unfamiliar to them, which they classified in a two-choice lexical decision task in their native language. The first experiment assessed the time course of neural processing in non-signers viewing telic/atelic signs without access to lip-reading cues in their native language, in order to understand how physical perceptual motion features are incorporated into linguistic processing. The second experiment further probed the impact of the visual information provided by lip-reading (decoding speech from visual information on the speaker's face, most importantly the lips) on the processing of telic/atelic signs by non-signers.