Abstract
Motor imagery (MI)-based brain-computer interfaces (BCIs) using electroencephalography (EEG) have found practical applications in external device control. However, the non-stationary nature of EEG signals continues to hinder BCI performance across multiple sessions, even for the same user. In this study, we address the impact of non-stationarity, also known as inter-session variability, on multi-session MI classification performance by introducing a novel approach, the relevant session-transfer (RST) method. Using cosine similarity as the relevance criterion, the RST method transfers relevant EEG data from the previous session to the current one. The effectiveness of the proposed RST method was evaluated through performance comparisons with the self-calibrating method, which uses only data from the current session, and the whole-session transfer method, which utilizes data from all prior sessions. We validated these methods on two datasets: a large public MI dataset (Shu Dataset) and our own gait-related MI dataset, which includes both healthy participants and individuals with spinal cord injuries. Our experimental results show that the proposed RST method yields a 2.29% improvement (p < 0.001) on the Shu Dataset and up to a 6.37% improvement on our dataset compared with the self-calibrating method. Moreover, our method surpassed the best recently reported performance on the Shu Dataset, further supporting the efficacy of the RST method in improving multi-session MI classification. These findings confirm that the proposed RST method can improve classification performance across multiple sessions in practical MI-BCIs.
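The abstract describes selecting and transferring prior-session EEG data based on cosine similarity to the current session. The sketch below illustrates that general idea only; it is not the authors' implementation, and the session-level mean-feature comparison, the similarity threshold, and all function names are illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two flattened feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def select_relevant_sessions(current_feats, prior_sessions, threshold=0.8):
    """Return prior-session datasets whose mean feature vector is
    sufficiently similar (cosine similarity >= threshold) to the current
    session's mean feature vector.

    current_feats : (n_trials, n_features) array for the current session
    prior_sessions: list of (features, labels) tuples from earlier sessions
    threshold     : illustrative cut-off; the actual criterion used in the
                    paper may differ.
    """
    reference = current_feats.mean(axis=0)
    selected = []
    for feats, labels in prior_sessions:
        if cosine_similarity(reference, feats.mean(axis=0)) >= threshold:
            selected.append((feats, labels))
    return selected

# Hypothetical usage: augment the current-session training set with the
# selected prior-session data before training the MI classifier.
# relevant = select_relevant_sessions(X_cur, history, threshold=0.8)
# X_train = np.vstack([X_cur] + [X for X, _ in relevant])
# y_train = np.concatenate([y_cur] + [y for _, y in relevant])
```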