Abstract

In recent years, brain-computer interfaces (BCIs) based on motor imagery (MI) electroencephalogram (EEG) signals have become an essential tool for rehabilitation, because they activate motor neurons and mediate interaction between the brain and rehabilitation devices. However, owing to inter-individual differences, the frequency range of even the same rhythm component of EEG recordings can vary, which complicates feature extraction for MI classification. Typical MI classification algorithms such as common spatial patterns (CSP) require multi-channel analysis and lack frequency information. As BCIs have developed, single-channel systems have become indispensable because of their simplicity of use; however, currently available single-channel detection methods suffer from low classification accuracy. To address this issue, two novel frameworks based on an improved two-dimensional nonlinear FitzHugh-Nagumo (FHN) neuron system are proposed to extract features from single-channel MI signals. To evaluate the effectiveness of the proposed methods, this research used an open-access database (BCI Competition IV dataset 2a), an offline database, and a 10-fold cross-validation procedure. Experimental results show that the improved nonlinear FHN system transfers noise energy into the MI signal, thereby effectively enhancing its time-frequency energy. Compared with traditional methods, the proposed methods achieve higher classification accuracy and robustness.
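The abstract does not spell out the FHN dynamics or the authors' improvements to them, but the classical two-dimensional FitzHugh-Nagumo model is governed by dv/dt = v - v^3/3 - w + I and dw/dt = eps*(v + a - b*w). The following is only a minimal sketch of driving such a neuron with a noisy single-channel segment and comparing narrow-band power before and after the neuron stage; the parameters (a, b, eps, gain), the synthetic 10 Hz input, and the helper names fhn_response and band_power are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' method): a classical two-dimensional
# FitzHugh-Nagumo neuron driven by a noisy signal standing in for a
# single-channel MI EEG segment. All parameters below are assumptions.
import numpy as np

def fhn_response(x, fs, a=0.7, b=0.8, eps=0.08, gain=2.0):
    """Forward-Euler integration of dv/dt = v - v^3/3 - w + gain*x,
    dw/dt = eps*(v + a - b*w); returns the membrane-like variable v."""
    dt = 1.0 / fs
    v = np.zeros_like(x)
    w = np.zeros_like(x)
    for n in range(1, len(x)):
        dv = v[n - 1] - v[n - 1] ** 3 / 3.0 - w[n - 1] + gain * x[n - 1]
        dw = eps * (v[n - 1] + a - b * w[n - 1])
        v[n] = v[n - 1] + dt * dv
        w[n] = w[n - 1] + dt * dw
    return v

def band_power(sig, fs, lo, hi):
    """Fraction of spectral power inside [lo, hi] Hz."""
    spec = np.abs(np.fft.rfft(sig)) ** 2
    f = np.fft.rfftfreq(sig.size, 1.0 / fs)
    return spec[(f >= lo) & (f <= hi)].sum() / spec.sum()

if __name__ == "__main__":
    fs = 250.0                                    # sampling rate of dataset 2a
    t = np.arange(0, 4.0, 1.0 / fs)               # one 4 s MI trial
    rng = np.random.default_rng(0)
    mu = 0.5 * np.sin(2 * np.pi * 10 * t)         # toy 10 Hz mu-rhythm component
    x = mu + rng.normal(scale=1.0, size=t.size)   # buried in broadband noise
    v = fhn_response(x, fs)
    print("relative 8-12 Hz power, raw input :", round(band_power(x, fs, 8, 12), 3))
    print("relative 8-12 Hz power, FHN output:", round(band_power(v, fs, 8, 12), 3))
```

Such a sketch only illustrates the general idea of passing a noisy single-channel segment through a nonlinear neuron model and inspecting the resulting time-frequency energy; the paper's improved FHN system and its feature-extraction frameworks are described in the full text.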
