Abstract
Brain-computer interfaces (BCIs) harnessing steady-state visual evoked potentials (SSVEPs) manipulate the frequency and phase of visual stimuli to generate predictable oscillations in neural activity. For BCI spellers, these oscillations are matched with alphanumeric characters, allowing users to select target numbers and letters. Advances in BCI spellers can, in part, be attributed to subject-specific optimization, including: 1) custom electrode arrangements; 2) filter sub-band assessments; and 3) stimulus parameter tuning. Here, we apply deep convolutional neural networks (DCNNs), demonstrating cross-subject functionality for the classification of frequency- and phase-encoded SSVEPs. Electroencephalogram (EEG) data are collected and classified using the same parameters across subjects. Subjects fixate on forty randomly cued flickering characters (5×8 keyboard array) during concurrent wet-EEG acquisition; these data are provided by an open-source SSVEP dataset. Our proposed DCNN, PodNet, achieves cross-subject offline classification accuracies of 86% and 77% for two data-capture periods, 6 seconds (information transfer rate = 40 bpm) and 2 seconds (information transfer rate = 101 bpm), respectively. Data from subjects demonstrating sub-optimal (<70%) performance are classified to similar levels after a short subject-specific training period. PodNet outperforms filter-bank canonical correlation analysis for a low-volume (3-channel), clinically feasible occipital electrode configuration. The networks defined in this study achieve functional performance for the largest number of SSVEP classes decoded via DCNNs to date. Our results demonstrate that PodNet achieves cross-subject, calibration-less classification and adapts to sub-optimal subject data and low-volume EEG electrode arrangements.
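The abstract reports a DCNN classifier for 40 SSVEP targets and information transfer rates in bits per minute (bpm). The sketch below illustrates both ingredients in generic form: a compact convolutional network mapping multi-channel EEG epochs to 40 classes, and the standard Wolpaw ITR formula used to express bpm. This is a minimal illustration only, not the published PodNet architecture; the channel count, sampling rate, layer sizes, and class count used here are assumptions chosen for the example.

# Illustrative sketch only: the abstract does not specify PodNet's layers,
# so this is a generic "compact CNN for multi-channel SSVEP EEG" in PyTorch,
# plus the standard Wolpaw ITR formula used to report bits per minute.
# Shapes (3 occipital channels, 2 s at 250 Hz, 40 targets) are assumptions.
import math
import torch
import torch.nn as nn

class ToySSVEPNet(nn.Module):
    """Hypothetical stand-in for a DCNN SSVEP classifier (not PodNet itself)."""
    def __init__(self, n_channels=3, n_samples=500, n_classes=40):
        super().__init__()
        self.features = nn.Sequential(
            # spatial filtering across the EEG channel axis
            nn.Conv2d(1, 16, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(16),
            nn.ELU(),
            # temporal convolution along the sample axis
            nn.Conv2d(16, 32, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
            nn.Dropout(0.5),
        )
        # infer the flattened feature size from a dummy forward pass
        with torch.no_grad():
            flat = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(flat, n_classes)

    def forward(self, x):  # x: (batch, 1, channels, samples)
        return self.classifier(self.features(x).flatten(1))

def wolpaw_itr(accuracy, n_classes, trial_seconds):
    """Bits per minute for an N-class selection task (standard Wolpaw formula)."""
    p, n = accuracy, n_classes
    bits = math.log2(n) + p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / trial_seconds

# e.g. wolpaw_itr(0.86, 40, 6) gives roughly 40 bits/min, consistent with
# the 6-second figure quoted in the abstract.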