Abstract

Spatially-coded steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) use the spatial distribution of the SSVEP to infer the user's gaze position relative to the stimulus blocks. Unlike conventional frequency-coded BCIs, a spatially-coded BCI can encode multiple targets with only a few frequencies and does not require the user to gaze directly at the flickering light. Previous recognition methods for spatially-coded BCIs required training data to build the recognition model, and the resulting long preparation time hindered practical application. This study proposed spatially-coded filter bank canonical correlation analysis (SCFBCCA), a recognition method that requires no pre-training, and used it to realize a six-target spatially-coded BCI. The study comprised two experiments, each with a different visual stimulus. In Experiment 1, we proposed an eigenvalue-based feature that can classify the horizontal position of targets and demonstrated the feasibility of recognition without training data; Experiment 1 also yielded conclusions that informed the visual stimulus design. Following these conclusions, a six-target spatially-coded BCI and its recognition method were designed in Experiment 2. In the six-target classification experiment, the average accuracy across five subjects reached 61.6 ± 3.8 %. For the first time, a spatially-coded BCI without pre-training was realized. Its accuracy was comparable to that of spatially-coded BCIs that use a small amount of training data, while its very short preparation time makes it more conducive to practical application.
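The abstract names filter bank canonical correlation analysis as the training-free core of SCFBCCA. The sketch below illustrates only a standard, training-free FBCCA scoring step (sub-band filtering, CCA against sine/cosine references, weighted combination of sub-band correlations); it does not reproduce the spatial-coding part of SCFBCCA that infers gaze position from the SSVEP distribution, and the filter bands, harmonic count, and weighting constants are common illustrative defaults rather than values taken from this paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.cross_decomposition import CCA


def make_reference(freq, n_samples, fs, n_harmonics=4):
    """Sine/cosine reference signals at the stimulus frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)                     # (n_samples, 2 * n_harmonics)


def cca_corr(X, Y):
    """Largest canonical correlation between X and Y (both samples x features)."""
    cca = CCA(n_components=1)
    cca.fit(X, Y)
    Xc, Yc = cca.transform(X, Y)
    return np.corrcoef(Xc[:, 0], Yc[:, 0])[0, 1]


def fbcca_scores(eeg, freqs, fs, bands=((8, 88), (16, 88), (24, 88)),
                 a=1.25, b=0.25, n_harmonics=4):
    """Training-free filter-bank CCA score for each candidate stimulus frequency.

    eeg: (n_channels, n_samples) single-trial EEG.
    Sub-band correlations are combined with the common weights w_n = n**(-a) + b.
    """
    n_samples = eeg.shape[1]
    weights = np.array([(n + 1) ** (-a) + b for n in range(len(bands))])
    scores = []
    for f in freqs:
        Y = make_reference(f, n_samples, fs, n_harmonics)
        rho = []
        for lo, hi in bands:
            bb, ab = butter(4, [lo, hi], btype='bandpass', fs=fs)
            Xf = filtfilt(bb, ab, eeg, axis=1).T     # (n_samples, n_channels)
            rho.append(cca_corr(Xf, Y))
        scores.append(np.sum(weights * np.square(rho)))
    return np.asarray(scores)


if __name__ == "__main__":
    fs, dur = 250, 2.0                               # assumed sampling rate and epoch length
    freqs = [10.0, 12.0, 15.0]                       # illustrative stimulus frequencies
    rng = np.random.default_rng(0)
    t = np.arange(int(fs * dur)) / fs
    # Synthetic 9-channel trial containing a 12 Hz SSVEP plus noise.
    eeg = 0.5 * np.sin(2 * np.pi * 12.0 * t) + rng.standard_normal((9, t.size))
    scores = fbcca_scores(eeg, freqs, fs)
    print("scores:", scores)
    print("detected frequency:", freqs[int(np.argmax(scores))])
```

In standard FBCCA the largest weighted score selects the detected frequency; in a spatially-coded design such as the one described here, the method would presumably go on to interpret how the correlations (or the proposed eigenvalue feature) are distributed across channels to decode the gazed target position.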
