Abstract

Brain-computer interfaces (BCIs) record brain activity in the form of electroencephalogram (EEG) signals captured by EEG headsets; these signals can be recorded, processed, and classified into different hand movements, which can in turn be used to control other IoT devices. Classifying hand movements brings these algorithms one step closer to real-life applications of EEG headsets. This paper applies different feature extraction techniques and machine learning algorithms to classify hand movements from EEG brain signals, with the goal of controlling prosthetic hands for amputees. Denoising and feature extraction of EEG signals are significant steps toward good classification accuracy, and we observed a considerable increase in the accuracy of all machine learning models when a moving average filter was applied to the raw EEG data. Feature extraction techniques such as the fast Fourier transform (FFT) and the continuous wavelet transform (CWT) were used in this study; three types of features were extracted: FFT features, CWT coefficients, and CWT scalogram images. We trained and compared machine learning (ML) models, namely logistic regression, random forest, k-nearest neighbors (KNN), light gradient boosting machine (LightGBM), and XGBoost, on the FFT and CWT features, and deep learning (DL) models, namely VGG-16, DenseNet201, and ResNet50, on the CWT scalogram images. XGBoost with FFT features gave the maximum accuracy of 88%.
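As a rough illustration of the pipeline the abstract describes, the sketch below denoises an EEG window with a moving average filter, extracts FFT magnitude features for an XGBoost classifier, and builds a CWT scalogram with PyWavelets. The window length, filter width, number of FFT bins, wavelet choice ("morl"), model hyperparameters, and the synthetic data are all illustrative assumptions, not details taken from the paper.

import numpy as np
import pywt
from xgboost import XGBClassifier

def moving_average(window, width=5):
    # Denoise a 1-D EEG window with a simple moving average filter
    # (filter width of 5 is an assumed, illustrative value).
    kernel = np.ones(width) / width
    return np.convolve(window, kernel, mode="same")

def fft_features(window, n_bins=64):
    # Use the magnitudes of the first n_bins real-FFT coefficients as features.
    return np.abs(np.fft.rfft(window))[:n_bins]

def cwt_scalogram(window, n_scales=64):
    # Build an image-like |CWT| array (n_scales x len(window)) of the kind
    # that could be rendered as a scalogram image for a CNN.
    coeffs, _ = pywt.cwt(window, np.arange(1, n_scales + 1), "morl")
    return np.abs(coeffs)

# Synthetic stand-in data: 200 single-channel windows of 256 samples each,
# two movement classes (real data would come from the EEG headset).
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((200, 256))
y = rng.integers(0, 2, size=200)

# Denoise, extract FFT features, and fit the best-performing model type.
X = np.array([fft_features(moving_average(w)) for w in X_raw])
clf = XGBClassifier(n_estimators=100)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))

# One scalogram, as would be fed to VGG-16 / DenseNet201 / ResNet50.
img = cwt_scalogram(moving_average(X_raw[0]))
print("scalogram shape:", img.shape)  # (64, 256)

On real data, the same feature extraction would be applied per recording window after train/test splitting, with the scalogram arrays saved as images for the DL models.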
