Abstract

This paper proposes a novel device-to-device transfer-learning algorithm for reducing the calibration cost in a steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) speller by leveraging electroencephalographic (EEG) data previously acquired with different EEG systems. The transfer is performed by projecting the scalp-channel EEG signals onto a latent domain shared across devices. Three spatial filtering techniques, namely channel averaging, canonical correlation analysis (CCA), and task-related component analysis (TRCA), were employed to extract the shared responses from different devices. The transferred data were integrated into a template-matching-based algorithm to detect SSVEPs. To evaluate its transferability, this paper conducted two sessions of simulated online BCI experiments with ten subjects using 40 visual stimuli modulated by the joint frequency-phase coding method. In each session, two different EEG devices were used: first, the Quick-30 system (Cognionics, Inc.) with dry electrodes, and second, the ActiveTwo system (BioSemi, Inc.) with wet electrodes. The proposed method with CCA- and TRCA-based spatial filters achieved significantly higher classification accuracy than the calibration-free standard CCA-based method. This paper validated the feasibility and effectiveness of the proposed method in implementing calibration-free SSVEP-based BCIs. The proposed method has great potential to enhance the practicality and usability of real-world SSVEP-based BCI applications by leveraging user-specific data recorded in previous sessions, even with different EEG systems and montages.
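To make the detection pipeline concrete, the following is a minimal sketch of the calibration-free standard CCA baseline mentioned above, together with an illustrative spatially filtered template-matching step. It is not the authors' implementation: the function names (`detect_ssvep_cca`, `template_match`), the number of harmonics, and the assumed data shapes are assumptions for illustration only.

```python
# Sketch of a CCA-based SSVEP detector and a template-matching variant.
# Assumptions: EEG epochs are (n_samples, n_channels) arrays; the spatial
# filter is a 1-D weight vector (e.g. CCA- or TRCA-derived). Hypothetical code.
import numpy as np
from sklearn.cross_decomposition import CCA

def sine_cosine_reference(freq, fs, n_samples, n_harmonics=5):
    """Build the sine/cosine reference matrix for one stimulus frequency."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.stack(refs, axis=1)            # (n_samples, 2 * n_harmonics)

def detect_ssvep_cca(eeg, stim_freqs, fs):
    """Calibration-free baseline: choose the stimulus whose reference
    signals have the largest canonical correlation with the EEG epoch."""
    scores = []
    for f in stim_freqs:
        Y = sine_cosine_reference(f, fs, eeg.shape[0])
        cca = CCA(n_components=1).fit(eeg, Y)
        u, v = cca.transform(eeg, Y)
        scores.append(np.corrcoef(u[:, 0], v[:, 0])[0, 1])
    return int(np.argmax(scores)), scores

def template_match(eeg, templates, spatial_filter):
    """Illustrative template matching: project the test epoch and each
    class template through a shared spatial filter and correlate the
    resulting 1-D time courses. `templates` is a list of averaged
    training epochs, one per stimulus class."""
    x = eeg @ spatial_filter
    scores = [np.corrcoef(x, tmpl @ spatial_filter)[0, 1] for tmpl in templates]
    return int(np.argmax(scores)), scores
```

In the proposed approach, the templates and spatial filters would be derived from data recorded on a different device and mapped into the shared latent domain, whereas the CCA baseline above needs no calibration data at all.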
