Abstract

Atypical gaze patterns are a promising biomarker of autism spectrum disorder. Measuring gaze accurately, however, typically requires highly controlled laboratory studies with specialized equipment that is often expensive, which limits the scalability of these approaches. Here we test whether a recently developed smartphone-based gaze estimation method can overcome such limitations and take advantage of the ubiquity of smartphones. As a proof of principle, we measured gaze while a small sample of well-assessed autistic participants and controls watched videos on a smartphone, both in the laboratory (with lab personnel) and in remote home settings (alone). We demonstrate that gaze data can be collected efficiently, in-home and longitudinally by participants themselves, with accuracy sufficient for quantitative, feature-based analysis (gaze estimation error below 1° of visual angle on average). Using this approach, we show that autistic individuals spend less gaze time on human faces and more gaze time on non-social features in the background, reproducing established findings in autism with just smartphones and no additional hardware. Our approach provides a foundation for scaling future research to larger and more representative participant groups at vastly reduced cost, while also enabling better inclusion of underserved communities.
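To make the feature-based analysis concrete, the following is a minimal sketch of how gaze time on faces might be quantified from smartphone gaze estimates, and how the reported 1° accuracy translates to on-screen distance. It assumes gaze is available as per-frame screen coordinates and that face bounding boxes have been pre-computed for each video frame; the function names, data layout, and the ~30 cm viewing distance are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def visual_angle_to_cm(deg, viewing_distance_cm=30.0):
        # On-screen distance subtended by a visual angle at a given viewing
        # distance; at ~30 cm (an assumed typical smartphone distance), the
        # reported ~1 degree error bound corresponds to roughly 0.5 cm.
        return np.tan(np.radians(deg)) * viewing_distance_cm

    def gaze_on_face_fraction(gaze_xy, face_boxes):
        # gaze_xy:    (N, 2) array of per-frame gaze points in screen pixels,
        #             with NaN rows marking frames where estimation failed.
        # face_boxes: length-N list; each entry holds (x0, y0, x1, y1) face
        #             boxes for that frame (empty list if no face is visible).
        hits, valid = 0, 0
        for (gx, gy), boxes in zip(gaze_xy, face_boxes):
            if np.isnan(gx) or np.isnan(gy):
                continue  # no usable gaze estimate for this frame
            valid += 1
            if any(x0 <= gx <= x1 and y0 <= gy <= y1
                   for x0, y0, x1, y1 in boxes):
                hits += 1
        return hits / valid if valid else float("nan")

Comparing this fraction between groups, and its complement for background regions, is one simple way to operationalize the "reduced gaze time on faces, longer gaze time on non-social features" result described above.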
