Abstract

Atypical eye gaze to social stimuli is one of the most frequently reported and studied social behaviors affected by autism spectrum disorder (ASD). The vast majority of this literature is based on analyses of gaze patterns as participants view social information, such as talking faces, on a computer screen. However, recent results suggest that generalizing gaze behaviors from computer screens to live interactions may not be valid. This study examines between- and within-group differences in the gaze behaviors of children with ASD and their neurotypical (NT) peers during a screen-based task and a live-interaction task. Results show between-group differences in gaze for the screen-based task only, not for the live-interaction task. We also find that the gaze behavior of NT children during the screen-based task correlates significantly with their gaze behavior during the live interaction: individuals who directed a higher percentage of gaze to the face in one task also did so in the other. However, there is no significant relationship between the gaze patterns of children with ASD across the two tasks. These results strongly caution against using the gaze of individuals with ASD recorded during screen-based tasks as a proxy for understanding their gaze behavior during live social interactions.

Highlights

  • With the advent of eye-tracking technology, researchers began turning their attention to interpreting more fine-grained features of gaze behavior

  • We used SensoMotoric Instruments (SMI™) software to draw a dynamic area of interest (AOI) on the face of each adolescent in the screen-based stimulus videos for the entire duration of their respective one-minute narratives

  • The shape of the AOIs changed dynamically throughout the video to conform to the height and width of the face on the screen, which varied with the speaker's distance from the camera and head angle (a minimal sketch of the resulting gaze-to-AOI computation follows this list)
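Because gaze-to-face percentages are ultimately derived from testing whether each gaze sample falls inside that frame's face AOI, the sketch below illustrates the computation. It assumes the dynamic AOI has already been exported as per-frame bounding boxes; the function, column names, and toy data are hypothetical (SMI's analysis software reports such AOI hit statistics directly), so this is an illustration of the idea rather than the authors' pipeline.

```python
# Minimal sketch (not the authors' pipeline): percentage of gaze samples landing
# inside a dynamic face AOI, assuming the AOI is available as a per-frame
# bounding box. Column names and data layout are hypothetical.
import pandas as pd

def percent_gaze_on_face(gaze: pd.DataFrame, aoi: pd.DataFrame) -> float:
    """
    gaze: one row per gaze sample with columns
          ['frame', 'gaze_x', 'gaze_y']  (screen coordinates in pixels)
    aoi:  one row per video frame with columns
          ['frame', 'x_min', 'y_min', 'x_max', 'y_max']  (face bounding box)
    Returns the percentage of gaze samples falling inside the face AOI.
    """
    # Align each gaze sample with the face bounding box for its video frame.
    merged = gaze.merge(aoi, on='frame', how='inner')
    inside = (
        (merged['gaze_x'] >= merged['x_min']) & (merged['gaze_x'] <= merged['x_max']) &
        (merged['gaze_y'] >= merged['y_min']) & (merged['gaze_y'] <= merged['y_max'])
    )
    return 100.0 * inside.mean()

# Toy example: three gaze samples, two of which hit the face box.
gaze = pd.DataFrame({'frame': [1, 1, 2],
                     'gaze_x': [510, 900, 520],
                     'gaze_y': [300, 100, 310]})
aoi = pd.DataFrame({'frame': [1, 2],
                    'x_min': [480, 485], 'y_min': [250, 255],
                    'x_max': [580, 585], 'y_max': [400, 405]})
print(percent_gaze_on_face(gaze, aoi))  # -> 66.66...
```

Per-participant percentages computed this way for the screen-based and live-interaction tasks can then be compared between groups or correlated within groups, as described in the abstract.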


Introduction

With the advent of eye-tracking technology, researchers began turning their attention to interpreting more fine-grained features of gaze behavior. Eye contact with a conversation partner during a live interaction significantly improved the ability of NT children to encode a sequence of random digits, but had no such effect on children with ASD.20 Given these findings, the impact of face-to-face interaction on social behavior may differ for individuals who have a documented deficit in reciprocal social communication. When participants viewed a video of a speaker using the same gaze directions and the same conversation topics as during the live interaction, there was no effect of speaker gaze on listener gaze. This pattern of results gives further credence to the idea that gaze responses to live vs. screen-based social stimuli cannot be equated. These data collectively demonstrate the complexity of untangling the relative contributions of speaker behavior, participant characteristics, interactional factors, and task design for our understanding of social gaze during live interactions.

