Abstract

Common spatial pattern (CSP) analysis and its extensions have been widely used as feature extraction approaches in brain-computer interfaces (BCIs). However, most CSP-based approaches do not use any prior knowledge that might be available about the two conditions (classes) to be classified. Therefore, their application is limited to datasets that contain enough variance information about the two conditions. For example, in some event-related potential (ERP) detection applications, such as the P300 speller, the information lies in the time domain rather than in the variance of spatial components. To address this problem, we first present a novel feature extraction method termed extended common spatial pattern (ECSP) analysis, which uses prior knowledge available from the data to produce a broader range of features than conventional CSP analysis does. Then, similarly, we introduce extended common temporal pattern (ECTP) analysis. Finally, to exploit both spatial and temporal information, we propose extended common spatial and temporal pattern (ECSTP) analysis. We use BCI Competition III, dataset II as the main dataset to evaluate the proposed methods. In addition, we use two other datasets, namely BCI Competition II, dataset IIb and BCI Competition IV, dataset IIb, to further evaluate their performance. On all the datasets, the proposed methods significantly outperform the conventional CSP, CTP, and CSTP methods, with ECSTP performing best among the proposed methods. Moreover, classification results show that the proposed methods are competitive with other state-of-the-art methods applied to these datasets.
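To illustrate why conventional CSP depends on class differences in variance, the following is a minimal sketch of standard CSP feature extraction (this is the baseline method, not the ECSP/ECTP/ECSTP extensions proposed in the paper); the function names, trial-array shapes, and number of filter pairs are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(X1, X2, n_pairs=3):
    """Compute conventional CSP spatial filters from two sets of EEG trials.

    X1, X2 : arrays of shape (trials, channels, samples), one per class.
    Returns a (channels, 2 * n_pairs) matrix of spatial filters.
    """
    def mean_cov(X):
        # Average trace-normalized spatial covariance over trials.
        covs = [x @ x.T / np.trace(x @ x.T) for x in X]
        return np.mean(covs, axis=0)

    C1, C2 = mean_cov(X1), mean_cov(X2)
    # Generalized eigenvalue problem: C1 w = lambda (C1 + C2) w.
    eigvals, eigvecs = eigh(C1, C1 + C2)
    # Filters at both ends of the eigenvalue spectrum maximize the variance
    # of one class while minimizing the variance of the other.
    order = np.argsort(eigvals)
    selected = np.concatenate([order[:n_pairs], order[-n_pairs:]])
    return eigvecs[:, selected]

def csp_features(W, X):
    # Log of the normalized variance of the spatially filtered signals.
    # Because the features are pure variances, plain CSP is only informative
    # when the two classes differ in the variance of spatial components,
    # which is not the case for ERP tasks such as the P300 speller.
    Z = np.einsum('cf,tcs->tfs', W, X)      # (trials, filters, samples)
    var = Z.var(axis=2)
    return np.log(var / var.sum(axis=1, keepdims=True))
```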
