Abstract

Objectives: Develop an effective and intuitive Graphical User Interface (GUI) for a Brain–Computer Interface (BCI) system that achieves high classification accuracy and Information Transfer Rates (ITRs) while using a simple classification technique. A further objective was the development of an output device capable of real-time execution of the selected commands.

Methods: A region-based T9 BCI system with familiar-face presentation cues, capable of eliciting strong P300 responses, was developed. Electroencephalogram (EEG) signals were collected from the Oz, POz, CPz and Cz electrode locations on the scalp and subsequently filtered, averaged and used to extract two features. These feature sets were classified using the Nearest Neighbour Approach (NNA). To complement the BCI system, a 'drone prototype' capable of simulating six different movements, each over eight distinct selectable distances, was also developed. This was achieved through the construction of a body with four movable legs, capable of tilting the main body forward, backward, up and down, as well as a pointer capable of turning left and right.

Results: Across ten participants with normal or corrected-to-normal vision, an average accuracy of 91.3 ± 4.8% and an ITR of 2.2 ± 1.1 commands/minute (12.2 ± 6.0 bits/minute) were achieved.

Conclusion: The proposed system was shown to elicit strong P300 responses. Compared with similar P300 BCI systems, which utilise a variety of more complex classifiers, it achieved competitive accuracy and ITR results, implying the superiority of the proposed GUI.

Significance: This study supports the hypothesis that more research, time and care should be devoted to developing GUIs for BCI systems.
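
To make the described pipeline concrete, the sketch below illustrates the kind of processing the abstract outlines: band-pass filtering and averaging of four-channel EEG epochs, reduction to two features, nearest-neighbour classification against class prototypes, and an ITR estimate. The filter band, sampling rate, choice of peak-based features and the use of Wolpaw's ITR formula are assumptions made for illustration, not details taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Illustrative sketch only: names, parameters and feature definitions below
# are assumptions, not the authors' implementation.

def preprocess(epochs, fs=256.0, band=(0.5, 12.0)):
    """Band-pass filter epochs and average across repetitions.

    epochs: array of shape (n_repetitions, n_channels, n_samples),
            e.g. channels ordered as Oz, POz, CPz, Cz.
    Returns the averaged waveform, shape (n_channels, n_samples).
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=-1)
    return filtered.mean(axis=0)

def extract_features(avg_epoch, fs=256.0):
    """Two illustrative features: peak amplitude and peak latency (seconds)
    of the channel-pooled averaged waveform."""
    pooled = avg_epoch.mean(axis=0)
    peak_idx = int(np.argmax(pooled))
    return np.array([pooled[peak_idx], peak_idx / fs])

def nearest_neighbour(feature_vec, prototypes, labels):
    """Assign the label of the closest class prototype (Euclidean distance)."""
    dists = np.linalg.norm(prototypes - feature_vec, axis=1)
    return labels[int(np.argmin(dists))]

def wolpaw_itr(n_classes, accuracy, selections_per_min):
    """Wolpaw ITR in bits/minute, a common BCI metric; the abstract does
    not state which ITR definition the authors used."""
    p = accuracy
    err_term = 0.0 if p >= 1.0 else (1 - p) * np.log2((1 - p) / (n_classes - 1))
    bits_per_selection = np.log2(n_classes) + p * np.log2(p) + err_term
    return bits_per_selection * selections_per_min
```

Note that the bits/minute figure such an estimate returns depends on the number of selectable commands and on how many selections make up one command, neither of which is fully specified in the abstract, so the sketch is not intended to reproduce the reported 12.2 ± 6.0 bits/minute.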
