Abstract
We investigated whether covert orienting of visuospatial attention can be used effectively in a brain-computer interface guided by event-related potentials. Three visual interfaces were tested: one that engaged voluntary orienting of visuospatial attention and two that elicited automatic orienting of visuospatial attention. We used two epoch classification procedures. Online epoch classification was performed with Independent Component Analysis, followed by fixed feature extraction and support vector machine classification. Offline epoch classification was performed with a genetic algorithm that retrieved the relevant features of the signal, which were then categorised with a logistic classifier. The offline classification, but not the online one, allowed us to differentiate the performance obtained with the interface requiring voluntary orienting of visuospatial attention from that obtained with the interfaces requiring automatic orienting. The offline classification revealed that participants performed better with the “voluntary” interface, an advantage further supported, for the first time, by neurophysiological data. Moreover, epochs were classified more accurately by the “genetic algorithm classifier” than by the “independent component analysis classifier”. We suggest that combining voluntary orienting of visuospatial attention with a classifier that permits feature extraction ad personam (i.e., the genetic algorithm classifier) can lead to more efficient control of visual BCIs.
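The offline procedure described above can be illustrated with a minimal, hypothetical sketch: a genetic algorithm searches for a per-participant subset of epoch features, and a logistic classifier then categorises the epochs using only the selected features. The synthetic data, feature layout, GA settings, and scikit-learn calls below are illustrative assumptions, not the study's actual implementation.

```python
# Sketch only: GA-based feature selection + logistic classification of ERP epochs.
# All names, sizes, and parameters here are assumptions for demonstration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for epoch features (e.g., amplitudes at several
# electrodes/time windows); only the first few columns carry class information.
n_epochs, n_features = 200, 30
X = rng.normal(size=(n_epochs, n_features))
y = rng.integers(0, 2, size=n_epochs)
X[y == 1, :5] += 0.8  # informative features for "target" epochs

def fitness(mask):
    """Cross-validated accuracy of a logistic classifier on the selected features."""
    if not mask.any():
        return 0.0
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, mask], y, cv=5).mean()

# Simple generational GA over binary feature masks.
pop_size, n_generations, mutation_rate = 20, 15, 0.05
population = rng.integers(0, 2, size=(pop_size, n_features)).astype(bool)

for _ in range(n_generations):
    scores = np.array([fitness(ind) for ind in population])
    # Tournament selection of parents.
    parents = population[
        [max(rng.choice(pop_size, 2, replace=False), key=lambda i: scores[i])
         for _ in range(pop_size)]
    ]
    # Single-point crossover.
    children = parents.copy()
    for i in range(0, pop_size - 1, 2):
        cut = rng.integers(1, n_features)
        children[i, cut:], children[i + 1, cut:] = (
            parents[i + 1, cut:].copy(), parents[i, cut:].copy())
    # Bit-flip mutation.
    flip = rng.random(children.shape) < mutation_rate
    children ^= flip
    # Keep the best individual from the previous generation (elitism).
    children[0] = population[np.argmax(scores)]
    population = children

best = max(population, key=fitness)
print("selected features:", np.flatnonzero(best))
print("cross-validated accuracy:", round(fitness(best), 3))
```

In this sketch each GA individual is a binary mask over features, and its fitness is the cross-validated accuracy of the logistic classifier restricted to the selected features; the study's actual feature set, fitness function, and GA parameters may differ.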
Highlights
Farwell and Donchin [1] first investigated whether participants could communicate by means of event-related potentials (ERPs; e.g., the P300) without involving the peripheral nervous system or voluntary muscle activity
We investigated whether the performance of an ERP-based brain-computer interface (BCI) system can be modulated by designing and implementing three new interfaces in which participants were required to perform covert orienting of visuospatial attention [37,39]
We analyzed whether the performance obtained by 12 healthy participants using the three new interfaces in an ERP-based visual BCI [35] was influenced by the specific classification system
Summary
Farwell and Donchin [1] first investigated whether participants could communicate by means of event-related potentials (ERPs; e.g., the P300) without involving the peripheral nervous system or voluntary muscle activity. This is possible through brain-computer interfaces (BCIs), systems that allow users to translate their brain signals directly into commands for controlling external devices [2]. Brain signals are digitized and analyzed by dedicated algorithms that extract specific features; these features are then classified and translated into commands. Users must try to modulate their mental states (e.g., concentrate on the target stimulus and ignore non-target ones) to obtain the desired effect on the device.
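A minimal, hypothetical sketch of this signal flow (digitize, extract features, classify, translate into a command) is given below. The feature extraction, the use of a support vector machine, the synthetic P300-like epochs, and the command mapping are all illustrative assumptions, not the system described in the study.

```python
# Sketch only: ERP epoch -> features -> classifier -> command.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def extract_features(epoch, fs=256):
    """Toy feature extraction: mean amplitude in successive 100 ms windows."""
    win = int(0.1 * fs)
    n_windows = epoch.shape[-1] // win
    return epoch[..., :n_windows * win].reshape(n_windows, win).mean(-1)

# Synthetic training epochs: one channel, 0.8 s, with a P300-like bump on targets.
fs, n_train = 256, 120
t = np.arange(int(0.8 * fs)) / fs

def make_epoch(is_target):
    noise = rng.normal(0, 1, t.size)
    p300 = 2.0 * np.exp(-((t - 0.3) ** 2) / 0.002) if is_target else 0.0
    return noise + p300

labels = rng.integers(0, 2, n_train)
X_train = np.array([extract_features(make_epoch(bool(l))) for l in labels])
clf = SVC(kernel="linear").fit(X_train, labels)

# At "runtime", each flashed item yields one epoch; the item whose epoch looks
# most like a target (highest decision value) becomes the selected command.
items = ["A", "B", "C", "D"]
target_item = 2  # hypothetical item the user attends to
runtime_epochs = [make_epoch(i == target_item) for i in range(len(items))]
scores = clf.decision_function([extract_features(e) for e in runtime_epochs])
print("selected command:", items[int(np.argmax(scores))])
```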