Abstract

The degradation of classification performance in subject-independent brain-computer interfaces is a challenging issue. A common approach to this problem is to collect data from as many subjects as possible and then train the system across all of them. To address this issue, this article introduces a streaming online learning method called autonomous deep learning (ADL) to classify five individual fingers from electroencephalography (EEG) signals. ADL is a deep learning architecture that constructs its own network structure during streaming learning and adapts that structure to changes in the input. Here, the input to ADL consists of common spatial pattern (CSP) features extracted from the EEG signals of healthy subjects. Experimental results on subject-dependent classification across four subjects using 5-fold cross-validation show that ADL achieved a classification accuracy of approximately 77%. This performance compares favorably with a random forest (RF) and a convolutional neural network (CNN), which achieved accuracies of about 53% and 72%, respectively. On subject-independent classification, ADL outperforms CNN by producing stable accuracies for both training and testing, whereas CNN's accuracy degrades to approximately 50%. These results suggest that ADL is a promising machine learning approach for dealing with the subject-independent classification problem.
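As a rough illustration of the evaluation pipeline described above (not the authors' ADL implementation, which is not given here), the sketch below shows CSP feature extraction followed by a random-forest baseline scored with 5-fold cross-validation. The MNE and scikit-learn APIs, the array shapes, and all parameter values are assumptions for illustration only.

    # Minimal sketch, assuming EEG epochs are available as a NumPy array X of shape
    # (n_epochs, n_channels, n_times) with integer finger labels y in {0..4}.
    # Parameter choices are illustrative, not taken from the paper.
    import numpy as np
    from mne.decoding import CSP                      # common spatial pattern filters
    from sklearn.pipeline import Pipeline
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def evaluate_csp_rf_baseline(X: np.ndarray, y: np.ndarray) -> float:
        """Return mean 5-fold cross-validation accuracy of a CSP + random-forest pipeline."""
        pipeline = Pipeline([
            ("csp", CSP(n_components=4, log=True)),   # spatial filtering -> log-variance features
            ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ])
        scores = cross_val_score(pipeline, X, y, cv=5)
        return scores.mean()

The same CSP features could be fed to any classifier in place of the random forest; the ADL model itself grows and prunes its layers during streaming learning, which is not captured by this static baseline.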
