Abstract

Gaze-independent event-related potential (ERP) based brain-computer interfaces (BCIs) yield relatively low BCI performance and traditionally employ unimodal stimuli. Bimodal ERP-BCIs may increase BCI performance due to multisensory integration or summation in the brain. An additional advantage of bimodal BCIs may be that the user can choose which modality or modalities to attend to. We studied bimodal, visual-tactile, gaze-independent BCIs and investigated whether ERP components' tAUCs and subsequent classification accuracies are increased for (1) bimodal vs. unimodal stimuli; (2) location-congruent vs. location-incongruent bimodal stimuli; and (3) attending to both modalities vs. to either one modality. We observed an enhanced bimodal (compared to unimodal) P300 tAUC, which appeared to be positively affected by location-congruency (p = 0.056) and resulted in higher classification accuracies. Attending to one or to both modalities of the bimodal location-congruent stimuli resulted in differences between ERP components, but not in classification performance. We conclude that location-congruent bimodal stimuli improve ERP-BCIs and allow the user to switch the attended modality without losing performance.
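
To make the tAUC measure above concrete, here is a minimal sketch that assumes tAUC is interpreted as a time-resolved area under the ROC curve, i.e., how well single-sample ERP amplitudes separate target from nontarget epochs. The paper's exact tAUC definition may differ, and all data shapes and the component window below are illustrative assumptions, not values from the study.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def time_resolved_auc(target_epochs, nontarget_epochs):
    """Sample-wise ROC AUC between target and nontarget epochs.

    target_epochs    : array (n_targets, n_samples), single channel
    nontarget_epochs : array (n_nontargets, n_samples)

    Returns an array of length n_samples giving, at each time point,
    how well the amplitude discriminates targets from nontargets.
    """
    n_samples = target_epochs.shape[1]
    labels = np.concatenate([np.ones(len(target_epochs)),
                             np.zeros(len(nontarget_epochs))])
    auc = np.empty(n_samples)
    for t in range(n_samples):
        amplitudes = np.concatenate([target_epochs[:, t],
                                     nontarget_epochs[:, t]])
        auc[t] = roc_auc_score(labels, amplitudes)
    return auc

# Hypothetical usage with simulated epochs: summarize a P300-like
# component by the mean AUC within an assumed time window.
rng = np.random.default_rng(0)
targets = rng.normal(1.0, 1.0, size=(40, 200))      # simulated target epochs
nontargets = rng.normal(0.0, 1.0, size=(200, 200))  # simulated nontarget epochs
auc_curve = time_resolved_auc(targets, nontargets)
component_auc = auc_curve[75:125].mean()            # window chosen arbitrarily
print(f"mean AUC in component window: {component_auc:.2f}")
```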

Highlights

  • Event-related potential (ERP) based brain-computer interfaces (BCIs) can be used to actively and voluntarily control a system, e.g., for communication (Farwell and Donchin, 1988) or navigation (Bell et al., 2008; Thurlings et al., 2010).

  • Effects of location-congruency on the bimodal ERP-BCI: if a benefit of bimodal (compared to unimodal) stimulus presentation and attending is found, as hypothesized above, it is relevant to know whether or not that effect depends on the spatial relation within the bimodal stimulus pairs.

  • We found an indication that location-congruency positively affects the late ERP component tAUC in response to bimodal stimuli (p = 0.056), and this trend corresponded to increased classification accuracies.


Summary

Introduction

Event-related potential (ERP) based brain-computer interfaces (BCIs) can be used to actively and voluntarily control a system, e.g., for communication (Farwell and Donchin, 1988) or navigation (Bell et al., 2008; Thurlings et al., 2010). The user can select an option by attending to the corresponding stimulus (target) while ignoring the other stimuli (nontargets). Stimulus-locked brain responses (ERPs) differ between attended targets and ignored nontargets. When the user does not gaze directly at the target but only covertly attends to it, the high-level endogenous ERP components, but not the low-level perceptual ERP components, differ from those of nontargets. This results in reduced BCI performance in terms of classification accuracy (and bitrate) (Brunner et al., 2010; Treder and Blankertz, 2010). In such paradigms, participants are therefore required to gaze directly at the visual stimuli.
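
As a concrete illustration of the selection principle just described, the sketch below trains a binary target/nontarget classifier on stimulus-locked epochs and then selects the option whose epochs score most target-like. This is a generic, hypothetical example (simulated data, an LDA classifier, made-up dimensions), not the pipeline used in this study.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

# Simulated training data: stimulus-locked epochs flattened to feature
# vectors (channels x time samples), labeled target (1) or nontarget (0).
n_features = 32 * 50                      # hypothetical: 32 channels, 50 samples
X_train = rng.normal(size=(300, n_features))
y_train = rng.integers(0, 2, size=300)
X_train[y_train == 1] += 0.1              # crude stand-in for a target ERP

clf = LinearDiscriminantAnalysis()
clf.fit(X_train, y_train)

# Online selection: each candidate option has several epochs (one per
# stimulus repetition); pick the option whose epochs look most
# target-like on average.
n_options, n_repetitions = 4, 10
option_epochs = rng.normal(size=(n_options, n_repetitions, n_features))
option_epochs[2] += 0.1                   # pretend the user attended option 2

scores = [clf.decision_function(option_epochs[i]).mean()
          for i in range(n_options)]
selected = int(np.argmax(scores))
print(f"selected option: {selected}")
```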
