Abstract

Brain-computer interfaces (BCIs) allow individuals with limited speech and physical abilities to communicate with their surrounding environment. Such BCIs require calibration sessions, which are burdensome for these individuals. We introduce a transfer learning approach for our novel hybrid BCI, in which brain electrical activity and cerebral blood flow velocity are recorded simultaneously using electroencephalography (EEG) and functional transcranial Doppler ultrasound (fTCD), respectively, in response to flickering mental rotation (MR) and word generation (WG) tasks. To reduce calibration requirements, for each BCI user we used mutual information to identify the most similar datasets collected from other users. From these datasets and the current user's own dataset, features derived from the power spectra of the EEG and fTCD signals were calculated. Mutual information and support vector machines were used for feature selection and classification. With the hybrid combination, an average accuracy of 93.04% was achieved for MR versus baseline, whereas WG versus baseline yielded an average accuracy of 90.94%. For MR versus WG, the hybrid combination achieved an average accuracy of 92.64%, compared with 88.14% for EEG alone. Average bit rates of 11.45, 17.24, and 19.72 bits/min were achieved for MR versus WG, MR versus baseline, and WG versus baseline, respectively. The proposed system outperforms state-of-the-art EEG-fNIRS BCIs in terms of accuracy and/or bit rate.
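The classification pipeline described above (power-spectrum features, mutual-information feature selection, and an SVM classifier) can be sketched as follows. This is a minimal illustration using scikit-learn and SciPy, not the authors' implementation: the synthetic trial array, sampling rate, window length, and the number of selected features (`k=20`) are all assumptions for demonstration.

```python
import numpy as np
from scipy.signal import welch
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in for segmented EEG/fTCD recordings:
# 60 trials x 4 channels x 512 samples, with binary task labels (e.g. MR vs. WG).
trials = rng.standard_normal((60, 4, 512))
labels = rng.integers(0, 2, size=60)

def psd_features(trials, fs=256):
    """Flatten each trial's per-channel Welch power spectrum into one feature vector."""
    _, psd = welch(trials, fs=fs, nperseg=128, axis=-1)
    return psd.reshape(len(trials), -1)

X = psd_features(trials)

# Mutual-information feature selection followed by an SVM classifier,
# evaluated with 5-fold cross-validation.
clf = make_pipeline(SelectKBest(mutual_info_classif, k=20), SVC(kernel="linear"))
scores = cross_val_score(clf, X, labels, cv=5)
print(round(scores.mean(), 3))
```

With random labels the cross-validated accuracy hovers near chance; on real task-labeled trials the same pipeline would reflect the class separability the abstract reports.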
