Abstract

Regaining communication abilities in patients who are unable to speak or move is one of the main goals in decoding brain waves for brain-computer interface (BCI) control. Many BCI approaches designed for communication rely on attention to visual stimuli, commonly applying an oddball paradigm, and require both eye movements and adequate visual acuity. These abilities may, however, be absent in patients who depend on BCI communication. We have therefore developed a response-based communication BCI that is independent of gaze shifts and instead utilizes covert shifts of attention to the left or right visual field. We recorded the electroencephalogram (EEG) from 29 channels and coregistered the vertical and horizontal electrooculogram. Data-driven decoding of small attention-based differences between the hemispheres, known as the N2pc, was performed using 14 posterior channels, which are expected to reflect correlates of visual spatial attention. Eighteen healthy participants responded to 120 statements by covertly directing attention to one of two colored symbols (a green cross for “yes” and a red cross for “no”), presented in the user’s left and right visual field, while maintaining central gaze fixation. On average across participants, 88.5% (SD: 7.8%) of responses were correctly decoded online. To investigate the potential influence of stimulus features on accuracy, we presented the symbols at different visual angles by altering symbol size and eccentricity. The offline analysis revealed that these stimulus features have a minimal impact on the controllability of the BCI. With our novel approach, we thus show that spatial attention to a colored symbol is a robust method for controlling a BCI, one with the potential to support severely paralyzed people with impaired eye movements and low visual acuity in communicating with their environment.

Highlights

  • A brain-computer interface (BCI) that can be controlled independently of gaze shifts could constitute a helpful assistive device for persons who suffer from severe neurological disorders

  • To provide further evidence that the BCI was not influenced by eye movements, even in the two participants showing larger EOG deflections than the remaining participants, we show these participants’ difference waves of the horizontal EOG (hEOG) (Figure 5C) and of the EEG signal at PO7/PO8 (Figure 5D) and compare them with the average signals from the remaining participants

  • While there is no firm evidence that it is possible to discriminate in EEG signals between thinking “yes” or “no,” we have shown that the direction of visual spatial attention can be clearly discriminated on a binary basis in EEG recordings from healthy participants, with an 88.5% decoding accuracy


INTRODUCTION

A brain-computer interface (BCI) that can be controlled independently of gaze shifts could constitute a helpful assistive device for persons who suffer from severe neurological disorders. Classification of hemispheric differences, depending on the hemifield in which the target was presented, has been successfully applied for target detection in aerial images (Matran-Fernandez and Poli, 2017), for the detection of the tilt of Gabor patches (Xu et al., 2016), as well as in visual search for colored digits (Awni et al., 2013) and circles (Tian et al., 2019). While data in these studies were analyzed offline, to our knowledge only one study has implemented a gaze-independent closed-loop BCI based on N2pc detection (Reichert et al., 2020a), in which participants performed a two-dimensional navigation task. In the BCI experiment presented here, we varied symbol sizes and eccentricities to investigate whether such stimulus features have an impact on classification accuracy, and if so, to determine the optimal set of stimulus features to prevent poor performance due to inappropriate parameter choices in future studies.
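The classification idea described above can be illustrated with a minimal sketch: the N2pc is a small negativity at posterior electrodes contralateral to the attended hemifield, so the sign of the PO8-minus-PO7 difference wave in an N2pc time window indicates the attended side. This is not the authors' decoder; all data below are synthetic, and the channel pair, time window, and amplitudes are illustrative assumptions.

```python
# Illustrative sketch (not the paper's method): classify left- vs
# right-hemifield covert attention from a synthetic N2pc, i.e. the
# contralateral negativity at posterior channels PO7 (left) / PO8 (right).
import numpy as np

FS = 250                        # assumed sampling rate (Hz)
T = np.arange(0, 0.5, 1 / FS)   # 0-500 ms post-stimulus epoch

def synth_trial(attended_left, rng):
    """Synthetic (po7, po8) traces: a ~2 uV negativity around 250 ms
    appears on the channel contralateral to the attended hemifield."""
    n2pc = -2.0 * np.exp(-((T - 0.25) ** 2) / (2 * 0.03 ** 2))
    po7 = rng.normal(0, 0.5, T.size) + (0 if attended_left else n2pc)
    po8 = rng.normal(0, 0.5, T.size) + (n2pc if attended_left else 0)
    return po7, po8

def classify(po7, po8):
    """Predict 'attended left' when the PO8-minus-PO7 difference wave,
    averaged in an assumed 200-300 ms N2pc window, is negative."""
    win = (T >= 0.2) & (T <= 0.3)
    return (po8 - po7)[win].mean() < 0

rng = np.random.default_rng(0)
correct = 0
n_trials = 120
for _ in range(n_trials):
    attended_left = bool(rng.integers(2))
    po7, po8 = synth_trial(attended_left, rng)
    correct += classify(po7, po8) == attended_left
accuracy = correct / n_trials
print(f"synthetic single-trial accuracy: {accuracy:.2f}")
```

With this artificially clean signal-to-noise ratio the sign test decodes nearly every trial; real single-trial EEG is far noisier, which is why the study relies on data-driven decoding across 14 posterior channels rather than a fixed two-channel rule.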

Participants and Recordings
Evaluation of BCI Performance
RESULTS
Evaluation of Stimulus Features
DISCUSSION
CONCLUSION