Abstract

This study examined the ability of young children with autism spectrum disorders (ASD) to detect affective correspondences between facial and vocal expressions of emotion using an intermodal matching paradigm. Four-year-old children with ASD (n = 18) and their age-matched, normally developing peers (n = 18) were presented with pairs of videotaped facial expressions accompanied by a single soundtrack matching the affect of one of the two facial expressions. In one block of trials, the emotions were portrayed by the children's mothers; in another block of trials, the same emotion pairs were portrayed by an unfamiliar woman. Findings showed that children with ASD were able to detect the affective correspondence between facial and vocal expressions of emotion portrayed by their mothers, but not by a stranger. Furthermore, in a control condition using inanimate objects and their sounds, children with ASD also showed a preference for sound-matched displays. These results suggest that children with ASD do not have a general inability to detect intermodal correspondences between visual and vocal events; however, their ability to detect affective correspondences between facial and vocal expressions of emotion may be limited to familiar displays.
