Abstract
The elimination of ocular artifacts is critical when analyzing electroencephalography (EEG) data for brain-computer interface (BCI) applications. Despite numerous promising solutions, most artifact removal algorithms require either electrooculography (EOG) recordings or an eye-blink detection algorithm, a reliance that can hinder their deployment in real-world applications. This paper proposes EEGANet, a framework based on generative adversarial networks (GANs), as a data-driven assistive tool for ocular artifact removal (source code is available at https://github.com/IoBT-VISTEC/EEGANet). Once trained, EEGANet removes ocular artifacts calibration-free, without relying on EOG channels or eye-blink detection algorithms. First, we tested EEGANet's ability to generate multi-channel EEG signals, its artifact removal performance, and its robustness on the EEG eye artifact dataset, which contains a significant degree of data fluctuation. The results show that EEGANet is comparable to state-of-the-art approaches that utilize EOG channels for artifact removal. Moreover, we demonstrated the effectiveness of EEGANet in BCI applications on two distinct datasets under inter-day and subject-independent schemes. Despite the absence of EOG signals, the classification performance of signals processed by EEGANet is equivalent to that of traditional baseline methods. This study demonstrates the potential of GANs as a data-driven artifact removal technique for any multivariate time-series bio-signal, a valuable step towards next-generation healthcare technology.
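The full EEGANet architecture is described in the paper itself. As a loose, self-contained illustration of the underlying idea only (an adversarially trained generator that maps contaminated EEG to artifact-free EEG, needing no EOG reference at inference time), the sketch below trains a toy linear generator against a logistic discriminator on synthetic single-channel windows. The synthetic data, the linear generator, and the combined adversarial-plus-reconstruction loss are all assumptions made for illustration; they are not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (assumed for illustration): "clean" EEG = alpha-band-like sinusoids;
# "contaminated" = clean + a slow, blink-shaped bump of random amplitude.
n, T = 256, 64
t = np.linspace(0, 1, T)
clean = np.sin(2 * np.pi * rng.uniform(8, 12, (n, 1)) * t)
blink = np.exp(-((t - 0.5) ** 2) / 0.01) * rng.uniform(1, 3, (n, 1))
contaminated = clean + blink

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Generator: linear map initialized near identity (starts by passing input through).
# Discriminator: a single logistic unit scoring "looks artifact-free".
Wg = np.eye(T) + 0.01 * rng.standard_normal((T, T))
wd = 0.01 * rng.standard_normal(T)
bd = 0.0
lr_g, lr_d, lam = 5e-3, 1e-2, 1.0

for step in range(1000):
    fake = contaminated @ Wg                      # generator's cleaned output
    # --- Discriminator step: push D(clean) -> 1, D(fake) -> 0 (BCE gradients) ---
    p_real = sigmoid(clean @ wd + bd)
    p_fake = sigmoid(fake @ wd + bd)
    wd -= lr_d * ((p_real - 1) @ clean + p_fake @ fake) / n
    bd -= lr_d * (np.mean(p_real - 1) + np.mean(p_fake))
    # --- Generator step: fool D, plus supervised L2 toward the paired clean signal ---
    p_fake = sigmoid(fake @ wd + bd)
    d_fake = np.outer(p_fake - 1, wd) + 2 * lam * (fake - clean)  # dLoss/dfake
    Wg -= lr_g * (contaminated.T @ d_fake) / n

denoised = contaminated @ Wg                      # inference: no EOG reference needed
mse_before = np.mean((contaminated - clean) ** 2)
mse_after = np.mean((denoised - clean) ** 2)
```

After training, applying the generator is a single forward pass on EEG alone, which is the calibration-free property the abstract emphasizes; in this toy run the reconstruction error of `denoised` drops well below that of the raw contaminated signal.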