Abstract
This paper presents an EEG-based brain-computer interface system for classifying eleven motor imagery (MI) tasks within the same hand. The proposed system utilizes the Choi-Williams time-frequency distribution (CWD) to construct a time-frequency representation (TFR) of the EEG signals. The constructed TFR is used to extract five categories of time-frequency features (TFFs). The TFFs are processed using a hierarchical classification model to identify the MI task encapsulated within the EEG signals. To evaluate the performance of the proposed approach, EEG data were recorded from eighteen intact subjects and four amputated subjects while they imagined performing each of the eleven hand MI tasks. Two performance evaluation analyses, namely channel-based and TFF-based analyses, are conducted to identify the subset of EEG channels and the TFF category, respectively, that yield the highest classification accuracy across the MI tasks. In each evaluation analysis, the hierarchical classification model is trained using two training procedures, namely a subject-dependent and a subject-independent procedure. These two training procedures quantify the capability of the proposed approach to capture both intra- and inter-personal variations in the EEG signals for different MI tasks within the same hand. The results demonstrate the efficacy of the approach for classifying MI tasks within the same hand. In particular, the classification accuracies obtained for the intact and amputated subjects are as high as and , respectively, for the subject-dependent training procedure, and and , respectively, for the subject-independent training procedure. These results suggest the feasibility of applying the proposed approach to control dexterous prosthetic hands, which can be of great benefit to individuals suffering from hand amputations.
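For reference, the Choi-Williams distribution used to build the TFR is a member of Cohen's class of time-frequency distributions with an exponential kernel. In its standard time-lag form, as commonly given in the time-frequency literature (this is the textbook definition, not a statement of the authors' particular discretization or choice of smoothing parameter), it reads

$$
\mathrm{CWD}_{x}(t,f)=\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}
\sqrt{\frac{\sigma}{4\pi\tau^{2}}}\,
\exp\!\left(-\frac{\sigma\,(u-t)^{2}}{4\tau^{2}}\right)
x\!\left(u+\frac{\tau}{2}\right)x^{*}\!\left(u-\frac{\tau}{2}\right)
e^{-j2\pi f\tau}\,du\,d\tau,
$$

where $x^{*}$ denotes the complex conjugate and $\sigma>0$ controls the trade-off between cross-term suppression and time-frequency resolution.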
Highlights
Nowadays, many individuals suffer from hand motor impairments due to strokes, hand amputations, and spinal cord injuries
This study investigates the feasibility of using the Choi-Williams time-frequency distribution (CWD), which enables the extraction of time-frequency features (TFFs) from the EEG signals, together with a hierarchical classification model to discriminate between eleven hand motor imagery tasks (HMITs): rest, basic finger and wrist movements, and grasping and functional movements (an illustrative sketch of this CWD-based pipeline is provided after these highlights)
The results of the channel-based and TFF-based analyses show that the proposed system achieved average accuracies of 88.8% and 80.8% for the subject-dependent training procedure (SDTP) and the subject-independent training procedure (SITP), respectively, using the TFFs of C1 extracted from the EEG channels in G1
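As a rough illustration of the pipeline described above, the sketch below computes a simplified discrete Choi-Williams TFR for a single EEG channel and derives two example time-frequency features. It is not the authors' implementation: the discretization, the kernel parameter sigma, the sampling rate, and the feature choices (joint time-frequency entropy and mean TFR magnitude) are assumptions made here for illustration and do not correspond to the paper's five TFF categories, its channel groups, or its classifier.

```python
# Illustrative sketch only: a simplified discrete Choi-Williams TFR and two
# example time-frequency features. Parameter values and feature choices are
# assumptions for illustration, not taken from the paper.
import numpy as np
from scipy.signal import hilbert


def choi_williams_tfr(x, sigma=1.0):
    """Simplified discrete Choi-Williams distribution of a real 1-D signal.

    Route: instantaneous autocorrelation -> ambiguity function ->
    exponential (Choi-Williams) kernel -> transform back to (time, frequency).
    Returns an (N, N) magnitude array (rows = time, columns = frequency bins).
    """
    z = hilbert(np.asarray(x, dtype=float))      # analytic signal
    N = len(z)
    R = np.zeros((N, N), dtype=complex)          # R[n, m] = z[n+m] * conj(z[n-m])
    for n in range(N):
        mmax = min(n, N - 1 - n)
        m = np.arange(-mmax, mmax + 1)           # note: index m spans a lag of 2m samples,
        R[n, m % N] = z[n + m] * np.conj(z[n - m])  # so this is a coarse discretization
    A = np.fft.fft(R, axis=0)                    # ambiguity function (Doppler x lag)
    theta = np.fft.fftfreq(N)[:, None]           # normalized Doppler axis
    tau = (np.fft.fftfreq(N) * N)[None, :]       # integer lag-index axis
    kernel = np.exp(-((2 * np.pi * theta * tau) ** 2) / sigma)   # Choi-Williams kernel
    tfr = np.fft.fft(np.fft.ifft(A * kernel, axis=0), axis=1)    # back to time, then to frequency
    return np.abs(tfr)


def example_features(tfr):
    """Two illustrative time-frequency features (not the paper's five categories)."""
    p = tfr / (tfr.sum() + 1e-12)                    # normalize to a 2-D distribution
    tf_entropy = -np.sum(p * np.log2(p + 1e-12))     # joint time-frequency entropy
    mean_power = tfr.mean()                          # average TFR magnitude
    return np.array([tf_entropy, mean_power])


if __name__ == "__main__":
    fs = 250                                         # assumed EEG sampling rate (Hz)
    t = np.arange(0, 2, 1 / fs)
    eeg_like = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
    feats = example_features(choi_williams_tfr(eeg_like, sigma=0.5))
    print("illustrative TFFs:", feats)
```

In a full system, such per-channel feature vectors would be concatenated across the selected EEG channels and passed to the classifier; the hierarchical classification model itself is not reproduced in this sketch.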
Summary
Many individuals suffer from hand motor impairments due to strokes, hand amputations, and spinal cord injuries. In recent years, we have witnessed substantial advancements in the design and development of wearable assistive devices, such as robotic prosthetic hands and exoskeletal orthotic hands. These assistive devices can be of great benefit; for example, an individual who has had a stroke can use an exoskeletal orthotic hand to support his/her disabled hand [1]. In this vein, brain-computer interface (BCI) systems have been employed to provide alternative non-muscular communication pathways that assist people suffering from motor disabilities or living with lost limbs to interact with their surroundings [1,2,3]