Abstract

A brain–computer interface (BCI) is a communication channel that transforms brain activity into specific commands for manipulating a personal computer or other home or electrical devices. In other words, a BCI is an alternative way of interacting with the environment by using brain activity instead of muscles and nerves. For that reason, BCI systems are of high clinical value for targeted populations suffering from neurological disorders. In this paper, we present a new processing approach applied to three publicly available BCI data sets: (a) a well-known multi-class (N = 6) code-modulated visual evoked potential (c-VEP) BCI system for able-bodied and disabled subjects; (b) a multi-class (N = 32) c-VEP system with slow and fast stimulus representation; and (c) a multi-class (N = 5) steady-state visual evoked potential (SSVEP) flickering BCI system. By estimating cross-frequency coupling (CFC), specifically δ–θ [δ: (0.5–4 Hz), θ: (4–8 Hz)] phase-to-amplitude coupling (PAC), within each sensor and across experimental time, we achieved high classification accuracy and information transfer rates (ITRs) in all three data sets, outperforming the originally reported ITRs. The bit rates obtained for both the disabled and able-bodied subjects reached the fastest reported level of 324 bits/min with the PAC estimator. Our approach also outperformed alternative signal features such as relative power (29.73 bits/min) and raw time-series analysis (24.93 bits/min), as well as the originally reported bit rates of 10–25 bits/min. In the second data set, we achieved an average ITR of 124.40 ± 11.68 bits/min for the slow (60 Hz) condition and an average ITR of 233.99 ± 15.75 bits/min for the fast (120 Hz) condition. In the third data set, we achieved an average ITR of 106.44 ± 8.94 bits/min. The current methodology outperforms all previous methodologies applied to each of the three freely available BCI data sets.
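The δ–θ phase-to-amplitude coupling on which the approach rests can be estimated in several ways; the following is a minimal sketch of one common estimator, the mean-vector-length modulation index computed from Hilbert-derived delta phase and theta amplitude. It illustrates the general PAC idea only, not necessarily the authors' exact pipeline, and the filter order and sampling rate are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def pac_mvl(x, fs, phase_band=(0.5, 4.0), amp_band=(4.0, 8.0)):
    """Mean-vector-length PAC: delta-phase to theta-amplitude coupling
    for a single sensor's time series x sampled at fs Hz."""
    phase = np.angle(hilbert(bandpass(x, *phase_band, fs)))  # delta phase
    amp = np.abs(hilbert(bandpass(x, *amp_band, fs)))        # theta envelope
    return np.abs(np.mean(amp * np.exp(1j * phase)))         # modulation index
```

On a synthetic signal whose 6 Hz amplitude is modulated by the phase of a 2 Hz rhythm, `pac_mvl` returns a clearly larger value than for an unmodulated control, which is the property the classifier exploits when PAC is computed per sensor and per trial.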

Highlights

  • The majority of brain–computer interface (BCI) systems are based on three major types of brain signals: the P300 event-related potential, the event-related desynchronization associated with motor imagery, and the steady-state visual evoked potentials (SSVEP) (Wolpaw et al., 2002)

  • Our main goal is to improve the performance and the bit rates of BCIs, focusing on the code-modulated visual evoked potential (c-VEP) component of the brain activity

  • An efficient algorithmic approach is presented for two c-VEP-based BCI systems and an SSVEP-based BCI system, with the number of classes ranging from N = 6 to N = 32



Introduction

From the very first work of Farwell and Donchin (Farwell and Donchin, 1988), the majority of P300-based brain–computer interface (BCI) systems have focused on creating new applications (Polikoff et al., 1995; Bayliss, 2003) and on constructing and testing new algorithms for the reliable detection of the P300 waveform in noisy data sets (Xu et al., 2003; Kaper et al., 2004; Rakotomamonjy et al., 2005; Thulasidas et al., 2006; Hoffmann et al., 2008). Other approaches to traditional BCI systems are based on visual evoked potential (VEP) paradigms. Two methods are most commonly employed to distinguish the various visual targets: phase coding and frequency coding (Wang et al., 2008). VEPs are usually observed over the occipital area in response to a repetitive visual stimulus, and they encode the ongoing visual information processing in the brain. Each target is coded differently and is presented by a unique stimulus sequence, which elicits a unique visual response identifiable in the brain activity. We will focus on a c-VEP system in which pseudorandom sequences are used to present the stimuli.
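c-VEP paradigms commonly realize such pseudorandom stimulus sequences as maximal-length sequences (m-sequences) produced by a linear-feedback shift register, with each target flashed according to a circularly shifted copy of the same code. The register length and tap positions below are illustrative, not taken from the data sets discussed in this paper:

```python
def m_sequence(taps=(6, 1), n_bits=6, seed=1):
    """Binary m-sequence from a Fibonacci LFSR.

    taps: 1-indexed register positions whose XOR is fed back.
    With taps (6, 1) on a 6-bit register the feedback polynomial is
    x^6 + x^5 + 1, which is primitive over GF(2), so the sequence
    attains the maximal period 2**6 - 1 = 63.
    """
    state = seed
    bits = []
    for _ in range((1 << n_bits) - 1):        # one full period
        bits.append(state & 1)                # output the lowest bit
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1      # XOR the tapped bits
        state = (state >> 1) | (fb << (n_bits - 1))
    return bits
```

A full-period m-sequence is balanced (32 ones versus 31 zeros for a 6-bit register) and has a nearly flat circular autocorrelation, which is what makes the shifted codes of different targets easy to tell apart in the recorded brain response.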

