Abstract

This paper investigates the possibility of using visual imagery tasks, which are mental imagery tasks in which a person mentally visualizes objects without perceiving them, as a control paradigm that can increase the control dimensionality of electroencephalography (EEG)-based brain-computer interfaces. Specifically, we propose an EEG-based approach for decoding visually imagined objects that uses the Choi-Williams time-frequency distribution to analyze the EEG signals in the joint time-frequency domain and extract a set of twelve time-frequency features (TFFs). The extracted TFFs are used to construct a multi-class support vector machine classifier that decodes the visually imagined objects. To validate the performance of our proposed approach, we recorded an EEG dataset from 16 healthy subjects while they imagined objects belonging to four different categories, namely nature (fruits and animals), decimal digits, English alphabet (capital letters), and arrow shapes (arrows with different colors and orientations). Moreover, we designed two performance evaluation analyses, a channel-based analysis and a feature-based analysis, to quantify the impact of utilizing different groups of EEG electrodes covering various scalp regions and the effect of reducing the dimensionality of the extracted TFFs on the performance of our proposed approach in decoding the imagined objects within each of the four categories. The experimental results demonstrate the efficacy of our proposed approach in decoding visually imagined objects: the average decoding accuracies obtained for the four categories reached 96.67%, 93.64%, 88.95%, and 92.68%, respectively.
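The pipeline described above (time-frequency feature extraction followed by a multi-class SVM) can be sketched as follows. This is an illustrative sketch only: it substitutes a standard spectrogram for the Choi-Williams distribution, uses a small hand-picked feature set rather than the paper's twelve TFFs, and trains on synthetic two-class signals in place of recorded EEG trials.

```python
import numpy as np
from scipy.signal import spectrogram
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def time_frequency_features(trial, fs=256):
    """Extract simple features from one trial's time-frequency plane.

    A spectrogram stands in for the Choi-Williams distribution used in
    the paper; the features below are illustrative, not the paper's TFFs.
    """
    f, t, S = spectrogram(trial, fs=fs, nperseg=64)
    S = S + 1e-12                       # avoid log(0)
    P = S / S.sum()                     # normalized TF energy
    return np.array([
        S.mean(), S.std(), S.max(), np.median(S),
        (f[:, None] * S).sum() / S.sum(),   # spectral centroid
        np.exp(np.mean(np.log(S))) / S.mean(),  # spectral flatness
        -np.sum(P * np.log(P)),             # time-frequency entropy
    ])

# Synthetic stand-in for imagined-object trials: two classes that
# differ in their dominant oscillation frequency (10 Hz vs 22 Hz).
rng = np.random.default_rng(0)
fs, n_trials, n_samples = 256, 40, 512
t = np.arange(n_samples) / fs
X, y = [], []
for label, freq in [(0, 10.0), (1, 22.0)]:
    for _ in range(n_trials):
        trial = np.sin(2 * np.pi * freq * t) + 0.5 * rng.standard_normal(n_samples)
        X.append(time_frequency_features(trial, fs))
        y.append(label)
X, y = np.array(X), np.array(y)

# Multi-class SVM classifier (scikit-learn's SVC handles multi-class
# via one-vs-one internally; standardizing features helps the RBF kernel).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

With real EEG, each trial would span multiple electrodes, and features would be extracted per channel and concatenated before classification, which is consistent with the channel-based analysis described later.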

Highlights

  • A brain-computer interface (BCI) is a system that analyzes the electrophysiological signals of the brain and produces commands that reflect the user’s mental activities [1]–[4]

  • For G1, the average classification accuracy (CA) and F1 values computed for each of the five classification scenarios using the time-frequency features (TFFs) extracted from the EEG electrodes in subgroup 2, which covers the frontal pole region, are substantially higher than those obtained using the TFFs extracted from the electrodes of each of the other five subgroups of G1

  • The average CA and F1 values computed for each of the five classification scenarios using the TFFs extracted from the EEG electrodes of subgroups 1, 3, and 4 are relatively close to one another and are higher than those obtained using the TFFs extracted from the electrodes of subgroups 5 and 6

Introduction

A brain-computer interface (BCI) is a system that analyzes the electrophysiological signals of the brain and produces commands that reflect the user’s mental activities [1]–[4]. Over the past two decades, researchers have developed BCI systems for different application domains, such as assisting individuals with severe motor impairments [1], [5]–[7], recognizing emotional states [8], and detecting pain [9]–[11]. In this regard, various types of neuroimaging modalities have been utilized to record brain activities and design BCI systems [4], such as functional magnetic resonance imaging (fMRI), positron emission tomography (PET), electroencephalography (EEG), and electrocorticography (ECoG). Among these neuroimaging modalities, EEG, which records the electrical activities of the brain measured at the scalp [4], has been widely employed to design BCI systems in various application domains [4], [12].
