Abstract
Over the past several years, steady-state visual evoked potential (SSVEP)-based brain-computer interfaces (BCIs) have attracted wide attention in BCI research due to their high information transfer rate (ITR), minimal user training, and applicability to most users. Conventional recognition methods for training-free SSVEP-based BCIs usually ignore the energy differences across frequencies in the electroencephalogram (EEG) background noise, which leads to a large variance in recognition accuracy across stimulus frequencies. To improve the performance of training-free SSVEP-based BCI systems and to balance recognition accuracy across stimulus frequencies, this paper proposes a recognition method based on multitaper spectral analysis and signal-to-noise ratio estimation (MTSA-SNR). A public 40-class benchmark SSVEP dataset recorded from 35 subjects was used to evaluate the proposed method. With a 2.25 s data length, the accuracies of the three methods were 81.1% (MTSA-SNR), 74.5% (canonical correlation analysis, CCA), and 73.4% (multivariate synchronization index, MSI), with corresponding ITRs of 101 bits/min (MTSA-SNR), 89 bits/min (CCA), and 87 bits/min (MSI). In the low-frequency range (8-9.8 Hz), the average recognition accuracies were 82.9% (MTSA-SNR), 82.0% (CCA), and 83.3% (MSI); in the high-frequency range (14-15.8 Hz), they were 78.6% (MTSA-SNR), 64.9% (CCA), and 61.8% (MSI). These results indicate that the proposed method can effectively improve the performance of training-free SSVEP-based BCI systems and balance recognition accuracy across stimulation frequencies.
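To make the idea behind MTSA-SNR concrete, the sketch below shows one plausible way to score candidate stimulus frequencies with a multitaper power spectrum and a local SNR estimate. This is an illustrative reconstruction under stated assumptions, not the authors' exact pipeline: the taper count, channel averaging, and the neighbouring-bin noise estimate are assumptions introduced here.

```python
# Minimal sketch of multitaper-spectrum-based SNR scoring for SSVEP frequency
# recognition. Illustrative only; taper parameters, channel averaging, and the
# noise neighbourhood are assumptions, not the paper's exact MTSA-SNR method.
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, fs, nw=3.0, n_tapers=5):
    """Average the periodograms of DPSS-tapered copies of a 1-D signal."""
    n = len(x)
    tapers = dpss(n, nw, Kmax=n_tapers)                # shape (n_tapers, n)
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    psd = spectra.mean(axis=0)                         # multitaper estimate
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd

def snr_score(freqs, psd, f0, half_bw=1.0):
    """Power at the target bin divided by the mean power of nearby bins,
    a simple estimate of SNR against the local EEG background noise."""
    target = np.argmin(np.abs(freqs - f0))
    neighbours = (np.abs(freqs - f0) <= half_bw) & (np.arange(len(freqs)) != target)
    return psd[target] / psd[neighbours].mean()

def recognise(eeg, fs, stim_freqs):
    """eeg: array of shape (n_channels, n_samples). Returns the candidate
    stimulus frequency with the highest channel-summed SNR score."""
    scores = np.zeros(len(stim_freqs))
    for channel in eeg:
        freqs, psd = multitaper_psd(channel, fs)
        scores += [snr_score(freqs, psd, f0) for f0 in stim_freqs]
    return stim_freqs[int(np.argmax(scores))]
```

Normalising each target's power by its own spectral neighbourhood is what lets such a score compensate for the 1/f-like drop in EEG background energy, which is the stated motivation for balancing accuracy between low- and high-frequency stimuli.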