Abstract

Worldwide, millions of people are locked in or confined to a wheelchair due to neuromuscular disorders or spinal cord injuries. These individuals are deprived of everyday social activities, such as interacting or playing games with other people; such activities are crucial for personal development and can greatly affect quality of life. This work presents the design and implementation of an electroencephalography (EEG) based motor imagery (MI) brain-computer interface (BCI) system that allows disabled and able-bodied individuals alike to control a drone in a 3D physical environment using only their thoughts. An improved version of the filter bank common spatial pattern (FBCSP) algorithm was developed; when tested on dataset 2a (4-class MI) of BCI Competition IV, it outperformed the competition-winning FBCSP algorithm (68.5% vs. 67.8% accuracy). A deep convolutional neural network (CNN) based algorithm was also implemented and tested on the same dataset, but it performed worse (62.9% accuracy) than both the competition winner and our proposed FBCSP algorithm. The improved FBCSP was then evaluated on our in-house 5-class (left hand, right hand, tongue, both feet, and rest) MI dataset, collected from 10 able-bodied subjects, and achieved a mean accuracy of 41.8±11.74%. This is considered a significant result, as it is well above the 20% chance level for five classes, though it is not yet good enough to attempt control of a real drone.
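The core idea behind FBCSP can be illustrated with a minimal two-class sketch: band-pass the EEG trials in several frequency bands, learn common spatial pattern (CSP) filters per band via a generalized eigendecomposition of the class covariance matrices, and use log-normalized variances of the spatially filtered signals as features. The band edges, filter order, and number of CSP pairs below are illustrative assumptions, and the mutual-information feature selection and one-vs-rest multiclass stages of the full FBCSP (and of the improvements described here) are omitted; this is not the authors' implementation.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt
from scipy.linalg import eigh

def csp_filters(X1, X2, n_pairs=2):
    """Learn CSP spatial filters for two classes of trials.

    X1, X2: arrays of shape (trials, channels, samples).
    Returns W of shape (2*n_pairs, channels).
    """
    def avg_cov(X):
        # Trace-normalized average spatial covariance across trials.
        return np.mean([x @ x.T / np.trace(x @ x.T) for x in X], axis=0)

    C1, C2 = avg_cov(X1), avg_cov(X2)
    # Generalized eigenproblem C1 w = lambda (C1 + C2) w; eigenvalues ascend.
    vals, vecs = eigh(C1, C1 + C2)
    order = np.argsort(vals)[::-1]          # sort descending by eigenvalue
    idx = np.r_[np.arange(n_pairs), np.arange(-n_pairs, 0)]
    return vecs[:, order][:, idx].T          # most discriminative filter pairs

def fbcsp_features(X1, X2, fs=250, bands=((8, 12), (12, 16), (16, 20), (20, 24))):
    """Band-filter trials, apply per-band CSP, return log-variance features."""
    feats1, feats2 = [], []
    for lo, hi in bands:
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        Xf1 = sosfiltfilt(sos, X1, axis=-1)
        Xf2 = sosfiltfilt(sos, X2, axis=-1)
        W = csp_filters(Xf1, Xf2)
        for feats, Xf in ((feats1, Xf1), (feats2, Xf2)):
            Z = np.einsum("fc,tcs->tfs", W, Xf)  # spatially filtered trials
            var = Z.var(axis=-1)
            feats.append(np.log(var / var.sum(axis=1, keepdims=True)))
    return np.hstack(feats1), np.hstack(feats2)
```

The resulting feature vectors (here, 4 bands × 4 filters = 16 features per trial) would then feed a standard classifier such as an SVM or LDA.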
