Abstract

Automatic classification of electroencephalography (EEG) signals across different types of mental activity is an active area of research with many applications, such as brain-computer interfaces (BCIs) and medical diagnosis. We introduce a simple yet effective way to use Kullback-Leibler (KL) divergence in the classification of raw EEG signals. We show that the k-nearest neighbor (k-NN) algorithm with KL divergence as the distance measure, when applied to our feature vectors, gives competitive classification accuracy and consistently outperforms the more commonly used Euclidean k-NN. We also develop and demonstrate a KL-based kernel for classifying EEG data with support vector machines (SVMs). Our KL-divergence-based kernel compares favorably to other well-established kernels such as the linear and radial basis function (RBF) kernels. The EEG data used in our classification experiments was recorded while subjects performed five different mental activities: math problem solving, letter composing, 3-D block rotation, counting, and resting (baseline). We present classification results for this data set obtained using raw EEG data with no explicit artifact removal in the pre-processing steps.
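The core idea of using KL divergence as the distance measure in k-NN can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the EEG feature vectors have been normalized to behave like discrete probability distributions (e.g., power histograms), and the smoothing constant `eps` and the toy data are hypothetical.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL divergence D(p || q) between two discrete distributions.

    A small eps is added before normalization to avoid log(0) and
    division by zero; this smoothing choice is an assumption.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def knn_predict(train_X, train_y, x, k=3):
    """Classify x by majority vote among the k training vectors
    closest to it under KL divergence (used here as the k-NN
    'distance', even though KL is not symmetric)."""
    dists = [kl_divergence(x, t) for t in train_X]
    nearest = np.argsort(dists)[:k]
    votes = [train_y[i] for i in nearest]
    return max(set(votes), key=votes.count)

# Hypothetical toy example: two classes of normalized feature vectors.
train_X = [[0.8, 0.1, 0.1], [0.7, 0.2, 0.1],
           [0.1, 0.1, 0.8], [0.1, 0.2, 0.7]]
train_y = [0, 0, 1, 1]
print(knn_predict(train_X, train_y, [0.75, 0.15, 0.10], k=3))  # -> 0
```

Because KL divergence is asymmetric, the direction of the comparison (query versus training vector) is itself a design choice; a symmetrized variant such as the average of D(p||q) and D(q||p) is a common alternative.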
