Abstract

Autism spectrum disorder (ASD) is characterized by difficulties in social interaction, verbal and non-verbal communication, and repetitive, atypical patterns of behavior. A number of studies indicate that many children with ASD prefer interacting with technology, and this preference can be leveraged to develop systems that may alleviate several challenges of traditional treatment and intervention. As a result, recent advances in computer and robotic technology are ushering in innovative assistive technologies for ASD intervention. The current work presents the design, development, and a usability study of an adaptive multimodal virtual reality (VR)-based social interaction platform for children with ASD. We hypothesize that a technological system that can detect a child's processing pattern and mental state from implicit cues in eye tracking and electrophysiological signals, including peripheral physiological and electroencephalography (EEG) signals, and adapt its interaction accordingly can assist and individualize traditional intervention approaches. The presented system is built around a virtual social environment, a school cafeteria, in which an individual with ASD interacts with virtual characters. An eye tracker, an EEG monitor, and biosensors that measure peripheral physiological signals are integrated with the VR task environment to acquire gaze, EEG, and several peripheral physiological signals in real time. In the current work, we show how eye gaze and task performance can be used in real time to adapt the intervention in VR; the remaining signals are collected for offline analysis. Results from a usability study with 12 participants with ASD are presented to demonstrate the viability of the proposed concepts within the VR system.
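The abstract states that eye gaze and task performance drive real-time adaptation of the VR intervention. The sketch below illustrates one way such a gaze-and-performance adaptation loop could be structured; it is not the authors' implementation, and `GazeSample`, `AVATAR_REGION`, `run_trial`, and the thresholds in `adapt_difficulty` are assumed placeholders for whatever the eye tracker SDK and task engine actually provide.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float      # normalized screen coordinates in [0, 1]
    y: float
    valid: bool   # False when the tracker loses the eyes

# Hypothetical region of interest covering the virtual character's face
AVATAR_REGION = (0.40, 0.20, 0.60, 0.50)  # (x_min, y_min, x_max, y_max)

def in_region(sample: GazeSample, region) -> bool:
    x_min, y_min, x_max, y_max = region
    return sample.valid and x_min <= sample.x <= x_max and y_min <= sample.y <= y_max

def adapt_difficulty(level: int, on_avatar_ratio: float, task_success: bool) -> int:
    """Rule-based adaptation: step the task level up when the child attends to the
    avatar and succeeds, step it down when attention or performance drops."""
    if task_success and on_avatar_ratio > 0.6:
        return min(level + 1, 5)
    if not task_success or on_avatar_ratio < 0.3:
        return max(level - 1, 1)
    return level

def run_session(run_trial, n_trials=10):
    """Run a sequence of trials, adapting the difficulty level after each one.

    `run_trial(level)` is a hypothetical callback into the task engine: it runs one
    trial at the given difficulty and returns (task_success, gaze_samples), where
    gaze_samples are GazeSample readings polled from the eye tracker during the trial.
    """
    level = 1
    for _ in range(n_trials):
        success, samples = run_trial(level)
        on_avatar = sum(in_region(s, AVATAR_REGION) for s in samples)
        ratio = on_avatar / max(len(samples), 1)
        level = adapt_difficulty(level, ratio, success)
    return level
```

The rule-based thresholds here stand in for whatever adaptation policy the system actually uses; the point of the sketch is only that gaze dwell on the avatar and trial outcome are combined to step the task difficulty up or down between trials.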
