Abstract

Objective: Auditory brain–computer interfaces (BCIs) are an assistive technology that can restore communication for motor-impaired end-users. Such non-visual BCI paradigms are of particular importance for end-users who may lose, or have already lost, gaze control. We attempted to show that motor-impaired end-users can learn to control an auditory speller on the basis of event-related potentials.

Methods: Five end-users with motor impairments, two of whom had additional visual impairments, participated in five sessions. We applied a newly developed auditory BCI paradigm with natural sounds and directional cues.

Results: Three of the five end-users learned to select symbols using this method. Averaged over all five end-users, the information transfer rate increased by more than 1800% from the first session (0.17 bits/min) to the last session (3.08 bits/min). The two best end-users achieved information transfer rates of 5.78 bits/min and accuracies of 92%.

Conclusions: Our results show that an auditory BCI combining natural sounds and directional cues can be controlled by end-users with motor impairments. Training improves the performance of end-users to the level of healthy controls.

Significance: To our knowledge, this is the first time end-users with motor impairments have controlled an auditory BCI speller with such high accuracies and information transfer rates. Furthermore, our results demonstrate that operating a BCI based on event-related potentials benefits from training; specifically, end-users may require more than one session to develop their full potential.
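The bits/min figures quoted above are information transfer rates (ITR). The abstract does not state which ITR definition the authors used; the sketch below assumes the Wolpaw formula, which is the most common choice in the BCI literature, and the function name, class count, and selection rate are purely illustrative assumptions rather than values taken from the study.

```python
import math

def wolpaw_itr_bits_per_min(accuracy: float, n_classes: int,
                            selections_per_min: float) -> float:
    """Wolpaw information transfer rate in bits/min.

    accuracy           -- probability of a correct selection (0 < accuracy <= 1)
    n_classes          -- number of selectable symbols in the speller
    selections_per_min -- completed selections per minute
    """
    if n_classes < 2 or not (0.0 < accuracy <= 1.0):
        raise ValueError("need n_classes >= 2 and 0 < accuracy <= 1")
    bits_per_selection = math.log2(n_classes)
    if accuracy < 1.0:
        bits_per_selection += accuracy * math.log2(accuracy)
        bits_per_selection += (1.0 - accuracy) * math.log2(
            (1.0 - accuracy) / (n_classes - 1))
    return bits_per_selection * selections_per_min

# Hypothetical example: 92% selection accuracy on an assumed 30-symbol
# speller at an assumed 1.5 selections per minute.
print(round(wolpaw_itr_bits_per_min(0.92, 30, 1.5), 2))
```

Under this formula, the ITR grows with accuracy, with the number of available symbols, and with the selection speed, which is why both the accuracy gains and the faster selections reported after training translate into a large relative increase in bits/min.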
