Abstract
Brain-computer interfaces (BCIs) are a rapidly expanding field of study and require accurate and reliable real-time decoding of patterns of neural activity. These protocols often exploit selective attention, a neural mechanism that prioritises the sensory processing of task-relevant stimulus features (feature-based attention) or task-relevant spatial locations (spatial attention). Within the visual modality, attentional modulation of neural responses to different inputs is well indexed by steady-state visual evoked potentials (SSVEPs). These signals are reliably present in single-trial electroencephalography (EEG) data, are largely resilient to common EEG artifacts, and allow separation of neural responses to numerous concurrently presented visual stimuli. To date, efforts to use single-trial SSVEPs to classify visual attention for BCI control have largely focused on spatial attention rather than feature-based attention. Here, we present a dataset that allows for the development and benchmarking of algorithms to classify feature-based attention using single-trial EEG data. The dataset includes EEG and behavioural responses from 30 healthy human participants who performed a feature-based motion discrimination task on frequency-tagged visual stimuli.
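To illustrate the core idea of frequency tagging in this context, the sketch below estimates SSVEP power at each stimulus's tagging frequency in a single EEG trial and labels the frequency with the stronger response as the attended one. This is a minimal, hypothetical example for orientation only, not the dataset's analysis pipeline; the tagging frequencies, sampling rate, channel count, and trial duration are assumed values.

```python
# Minimal sketch (assumptions only, not the paper's method): classify the
# attended stimulus from a single trial by comparing spectral power at the
# two frequency-tagging frequencies.
import numpy as np

def ssvep_power(trial, sfreq, freq):
    """Mean power across channels at `freq` from the trial's Fourier spectrum.

    trial : ndarray, shape (n_channels, n_samples) -- single-trial EEG
    sfreq : float -- sampling rate in Hz
    freq  : float -- frequency-tagging frequency in Hz
    """
    n = trial.shape[1]
    spectrum = np.fft.rfft(trial, axis=1)
    freqs = np.fft.rfftfreq(n, d=1.0 / sfreq)
    bin_idx = np.argmin(np.abs(freqs - freq))        # nearest FFT bin
    return np.mean(np.abs(spectrum[:, bin_idx]) ** 2)

def classify_attended(trial, sfreq, tag_freqs):
    """Return the index of the tagging frequency with the strongest SSVEP."""
    powers = [ssvep_power(trial, sfreq, f) for f in tag_freqs]
    return int(np.argmax(powers))

# Synthetic demonstration: 64 channels, 2 s at 256 Hz, with an artificial
# 15 Hz component standing in for the attended stimulus (hypothetical values).
rng = np.random.default_rng(0)
sfreq, tag_freqs = 256.0, (12.0, 15.0)
t = np.arange(0, 2.0, 1.0 / sfreq)
trial = rng.standard_normal((64, t.size))
trial += 0.5 * np.sin(2 * np.pi * tag_freqs[1] * t)  # inject a 15 Hz SSVEP
print(classify_attended(trial, sfreq, tag_freqs))     # -> 1
```

In practice, published SSVEP decoders typically go beyond single-bin power comparisons (for example, canonical correlation analysis across channels), but the power-at-tag-frequency comparison above captures the basic principle that attention enhances the response at the attended stimulus's flicker frequency.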