Abstract

The dataset presented here contains recordings of electroencephalogram (EEG) and electrooculogram (EOG) from four advanced locked-in state (LIS) patients suffering from ALS (amyotrophic lateral sclerosis). These patients could no longer use commercial eye-trackers, but they could still move their eyes and used the remnant oculomotor activity to select letters to form words and sentences using a novel auditory communication system. Data were recorded from the four patients over a variable number of visits (from 2 to 10), each visit comprising 3.22 ± 1.21 days, with 5.57 ± 2.61 sessions recorded per day. The patients performed a succession of different session types, namely Training, Feedback, Copy spelling, and Free spelling. The dataset provides insight into the progression of ALS and presents a valuable opportunity to design and improve assistive and alternative communication technologies and brain-computer interfaces. It might also help redefine the course of progression in ALS, thereby improving clinical judgement and treatment.

Highlights

  • Amyotrophic lateral sclerosis (ALS) is a neurodegenerative disorder that, in its final stages, paralyzes affected individuals, impairing their ability to communicate[1,2,3,4]

  • Patients with intact consciousness and voluntary eye movement control, who can blink their eyes or twitch their muscles, are said to be in a locked-in state (LIS)[5,6]

  • In patients who survive attached to life-support systems, the progression of the disease destroys oculomotor control, leading to the loss of gaze fixation and impeding the use of eye-tracking based communication technologies[9,10,11]

Background & Summary

Amyotrophic lateral sclerosis (ALS) is a neurodegenerative disorder that, in its final stages, paralyzes affected individuals, impairing their ability to communicate[1,2,3,4]. An auditory electrooculogram (EOG) based communication system[12] was developed to provide a means of communication to ALS patients who lacked gaze fixation and were unable to use commercial AAC eye-tracking devices, but who retained remnant oculomotor control sufficient to form words, phrases, and sentences, as described in Tonin & Jaramillo-Gonzalez et al.[12]. The study design and paradigm are described in detail in the Methods section. This data descriptor outlines the EEG and EOG recordings from four different patients, acquired during their use of the auditory communication system after they had first trained progressively to control it. Although the system can be considered successful in enabling communication, other analytical methods could still improve its speed and efficiency, for example, by testing other feature extraction methods offline or by comparing the performance of different machine-learning methods in classifying the patients' responses.
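As an illustration of the kind of offline analysis the paragraph above envisions, the following is a minimal, purely hypothetical sketch of classifying binary "yes"/"no" EOG responses from a single amplitude feature. The signal model, epoch length, amplitudes, and threshold are illustrative assumptions, not the actual format or parameters of this dataset or of the system in Tonin & Jaramillo-Gonzalez et al.[12].

```python
# Hypothetical sketch: threshold classification of simulated EOG epochs.
# All signal parameters below are illustrative assumptions, not taken
# from the dataset described in this paper.
import numpy as np

rng = np.random.default_rng(0)

def simulate_epoch(is_yes, n_samples=500):
    """Simulate one EOG epoch: a 'yes' answer adds a slow positive
    deflection (eye movement) on top of background noise."""
    noise = rng.normal(0.0, 5.0, n_samples)        # background noise (arbitrary µV scale)
    if is_yes:
        deflection = 40.0 * np.hanning(n_samples)  # smooth eye-movement bump
        return noise + deflection
    return noise

def mean_amplitude(epoch):
    """Feature: mean amplitude over the whole epoch."""
    return float(np.mean(epoch))

# Build a small labeled set of simulated epochs (1 = "yes", 0 = "no")
labels = rng.integers(0, 2, 200)
features = np.array([mean_amplitude(simulate_epoch(bool(y))) for y in labels])

# Classify each epoch by comparing its feature to a fixed threshold
threshold = 10.0
predictions = (features > threshold).astype(int)
accuracy = float(np.mean(predictions == labels))
print(f"threshold-classifier accuracy: {accuracy:.2f}")
```

In practice one would replace the simulated epochs with real EEG/EOG epochs from the recordings and compare this baseline feature against alternative feature-extraction and machine-learning pipelines.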

