Abstract

This article establishes a novel electroencephalography (EEG)-based brain-computer interface (BCI) system for ground vehicle control, with potential applications in mobility assistance for people with disabilities. To enable an intuitive motor imagery (MI) paradigm of "left," "right," "push," and "pull," a driving-simulator-based EEG recording and automatic labeling platform is built for dataset construction. In the preprocessing stage, a combined wavelet and canonical correlation analysis (CCA) method is used to remove artifacts and improve the signal-to-noise ratio. An ensemble-learning-based training and testing framework is proposed for MI EEG classification, achieving an average classification accuracy of about 91.75%. This approach combines the common spatial pattern (CSP), which extracts features of event-related potentials, with convolutional neural networks (CNNs), which provide strong feature-learning and classification capacity. To convert the classification results of EEG segments into motion control signals for the ground vehicle, a shared control strategy is used to realize the commands "left-steering," "right-steering," "acceleration," and "stop," while avoiding collisions with obstacles detected by a single-line LIDAR. Online experimental results on a model vehicle platform validate the performance of the established BCI system and reveal the application potential of BCI in vehicle control and automation.
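As an illustration of the CSP feature-extraction step mentioned above, the following is a minimal NumPy/SciPy sketch of the standard two-class CSP formulation (log-variance features from spatial filters found via a generalized eigenvalue problem). It is not the paper's implementation: the function names and synthetic data are hypothetical, and the article's framework extends this idea to the four MI classes and feeds the resulting features into CNNs within an ensemble.

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(trials_a, trials_b, n_pairs=3):
    """Compute CSP spatial filters from two classes of EEG trials.

    trials_a, trials_b: arrays of shape (n_trials, n_channels, n_samples).
    Returns a (2 * n_pairs, n_channels) spatial filter matrix.
    """
    def mean_cov(trials):
        covs = []
        for x in trials:
            c = x @ x.T
            covs.append(c / np.trace(c))  # normalize by total power
        return np.mean(covs, axis=0)

    cov_a, cov_b = mean_cov(trials_a), mean_cov(trials_b)
    # Generalized eigenvalue problem: cov_a w = lambda (cov_a + cov_b) w
    vals, vecs = eigh(cov_a, cov_a + cov_b)
    # Keep filters that maximize variance for one class and minimize it for the other
    order = np.argsort(vals)
    picks = np.r_[order[:n_pairs], order[-n_pairs:]]
    return vecs[:, picks].T

def csp_features(trials, filters):
    """Log-variance features of spatially filtered trials."""
    feats = []
    for x in trials:
        z = filters @ x
        var = np.var(z, axis=1)
        feats.append(np.log(var / var.sum()))
    return np.array(feats)

# Hypothetical example: 20 trials per class, 16 channels, 500 samples
rng = np.random.default_rng(0)
a = rng.standard_normal((20, 16, 500))
b = rng.standard_normal((20, 16, 500))
W = csp_filters(a, b)
X = csp_features(np.concatenate([a, b]), W)  # features for a downstream classifier (e.g., a CNN)
```

In practice, multi-class MI problems such as the four-command paradigm here are often handled by computing CSP filters pairwise or one-versus-rest and concatenating the resulting features before classification.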
