Abstract

Facial expressions in sign language carry a variety of communicative features. While emotion can modulate a spoken utterance through changes in intonation, duration and intensity, in sign language specific facial expressions presented concurrently with a manual sign perform this function. When deaf adult signers cannot see facial features, their ability to judge emotion in a signed utterance is impaired (Reilly et al. in Sign Lang Stud 75:113–118, 1992). We examined the role of the face in the comprehension of emotion in sign language in a group of typically developing (TD) deaf children and in a group of deaf children with autism spectrum disorder (ASD). We replicated Reilly et al.’s (Sign Lang Stud 75:113–118, 1992) adult results in the TD deaf signing children, confirming the importance of the face in understanding emotion in sign language. The ASD group performed more poorly on the emotion recognition task than the TD children. The deaf children with ASD showed a deficit in emotion recognition during sign language processing analogous to the deficit in vocal emotion recognition that has been observed in hearing children with ASD.

Highlights

  • In order to recognize emotions in spoken language, hearing individuals use both visual cues and auditory cues, such as changes in the frequency, intonation, intensity and rate of speech (Most and Michaelis 2012).

  • Reilly et al.’s (1992) finding for adult signers is replicated in children aged 8.5–16.5 years, indicating that the use of facial actions to interpret the emotional meaning of a signed utterance is established by this age.

  • The second main finding is that the autism spectrum disorder (ASD) group was less accurate in its judgments of emotion than typically developing (TD) deaf children.


Introduction

In order to recognize emotions in spoken language, hearing individuals use both visual cues (such as facial expressions and body posture) and auditory cues, such as changes in the frequency, intonation, intensity and rate of speech (Most and Michaelis 2012). For deaf individuals, emotional information must be conveyed in sign language using only visual cues. These can be found in the movement and positioning of the hands, face, eyes, torso and shoulders. Studies examining where deaf individuals look during sign language comprehension have demonstrated that the face is attended to more than other visual cues, including the hands (Agrafiotis et al. 2003; Emmorey et al. 2009). There may be a number of reasons for this. One is that the face provides linguistic and social information as well as cues for lip reading (Letourneau and Mitchell 2011); another may be that important emotional information is conveyed by the face, and that signers need to pay particular attention to facial cues in the absence of tone of voice and other auditory cues (Reilly et al. 1990).
