Abstract

Electroencephalography signals are recorded as multidimensional datasets. We propose a new framework based on the augmented covariance matrix, which stems from an autoregressive model, to improve motor imagery classification. The Yule-Walker equations derived from this autoregressive model reveal a symmetric positive definite matrix: the augmented covariance matrix. Since the state of the art for classifying covariance matrices is based on Riemannian geometry, a natural idea is to apply this Riemannian-geometry approach to augmented covariance matrices. The construction of the augmented covariance matrix has a natural connection with the delay embedding theorem proposed by Takens for dynamical systems. Such an embedding depends on two parameters, the delay and the embedding dimension, which correspond respectively to the lag and the order of the autoregressive model. This connection provides new methods for computing these hyperparameters in addition to standard grid search. We evaluate our approach on several datasets and several subjects using the MOABB framework, with both within-session and cross-session evaluation; the augmented covariance matrix (ACM) performed better than state-of-the-art methods. The improvement in results is due to the fact that the augmented covariance matrix incorporates not only spatial but also temporal information; through the embedding procedure, it captures nonlinear components of the signal, which allows the leveraging of dynamical-systems algorithms. These results extend the concepts and results of Riemannian-distance-based classification algorithms.
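The following is a minimal, hypothetical Python sketch of the pipeline described above, not the authors' reference implementation: it builds augmented trials by time-delay embedding, estimates their covariance matrices, and classifies them with a Riemannian minimum-distance-to-mean classifier from pyRiemann. The helper function `augmented_trials`, the embedding order, and the lag values are illustrative assumptions; the random arrays stand in for epoched EEG data such as that provided by MOABB.

```python
# Hypothetical sketch: augmented covariance matrices via delay embedding,
# classified with Riemannian geometry (pyRiemann). Parameter values are
# illustrative, not taken from the paper.
import numpy as np
from pyriemann.estimation import Covariances
from pyriemann.classification import MDM


def augmented_trials(X, order=4, lag=8):
    """Stack `order` delayed copies of each trial (delay embedding).

    X : array of shape (n_trials, n_channels, n_times)
    Returns an array of shape
    (n_trials, n_channels * order, n_times - (order - 1) * lag).
    """
    n_trials, n_channels, n_times = X.shape
    length = n_times - (order - 1) * lag
    X_aug = np.empty((n_trials, n_channels * order, length))
    for k in range(order):
        start = k * lag
        X_aug[:, k * n_channels:(k + 1) * n_channels, :] = X[:, :, start:start + length]
    return X_aug


# Illustrative usage on random data standing in for epoched EEG trials.
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 8, 256))   # 40 trials, 8 channels, 256 samples
y = rng.integers(0, 2, size=40)         # two motor-imagery classes

X_aug = augmented_trials(X, order=4, lag=8)             # delay-embedded trials
covs = Covariances(estimator="scm").fit_transform(X_aug)  # augmented covariance matrices (SPD)
clf = MDM(metric="riemann").fit(covs, y)                # minimum-distance-to-mean classifier
print(clf.predict(covs[:5]))
```

In this sketch the delay (`lag`) and embedding dimension (`order`) play the role of the autoregressive lag and order mentioned in the abstract; in practice they would be chosen by grid search or by the dynamical-systems criteria the paper proposes.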
