Abstract

The neural dynamics underlying the coordination of spatially directed limb and eye movements in humans are not well understood. Part of the difficulty has been a lack of signal-processing tools suited to the analysis of nonstationary electroencephalographic (EEG) signals. Here, we use multivariate empirical mode decomposition (MEMD), a data-driven approach that does not rely on predefined basis functions. High-density EEG and arm and eye movements were recorded synchronously in 10 subjects performing time-constrained reaching and/or eye movements. Subjects were allowed to move both the hand and the eyes, only the hand, or only the eyes after a 500-700 ms delay interval during which the hand and gaze remained on a central fixation cross. An additional condition involved a non-spatially-directed "lift" movement of the hand. The neural activity during a 500 ms delay interval was decomposed into intrinsic mode functions (IMFs) using MEMD. Classification analysis revealed that gamma-band (>30 Hz) IMFs produced more classifiable features than IMFs in other frequency bands, differentiating the EEG according to the upcoming movement. A benchmark test against conventional algorithms demonstrated that MEMD was the most effective at extracting oscillatory bands from EEG, yielding the best classification of the different movement conditions. The gamma rhythm extracted with MEMD also correlated more strongly with eventual movement accuracy than any other band rhythm and than the gamma rhythm extracted by any of the other algorithms.
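To make the described workflow concrete, the sketch below shows one way the delay-period analysis could be organized in Python. It assumes the MEMD decomposition has already been run with an existing MEMD implementation, so each trial's IMFs arrive as an array; the sampling rate, the use of mean instantaneous frequency to pick out gamma-band (>30 Hz) IMFs, and the linear discriminant classifier with 5-fold cross-validation are illustrative assumptions, not the paper's exact settings.

```python
# Minimal sketch of the delay-period pipeline described in the abstract.
# The MEMD step itself is assumed to have been performed elsewhere; each
# trial's IMFs are passed in as an array of shape (n_imfs, n_channels, n_samples).
import numpy as np
from scipy.signal import hilbert
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 1000.0        # sampling rate in Hz (assumed)
GAMMA_LO = 30.0    # lower edge of the gamma band in Hz

def mean_inst_freq(imf, fs=FS):
    """Mean instantaneous frequency (Hz) of one IMF via the Hilbert transform."""
    phase = np.unwrap(np.angle(hilbert(imf)))
    return np.mean(np.diff(phase)) * fs / (2.0 * np.pi)

def gamma_power_features(imfs, fs=FS):
    """Log power of gamma-band IMFs, one feature per channel, for one delay epoch."""
    feats = []
    for ch in range(imfs.shape[1]):
        # keep IMFs whose mean instantaneous frequency lies above 30 Hz
        gamma = [imf[ch] for imf in imfs if mean_inst_freq(imf[ch], fs) > GAMMA_LO]
        power = np.sum([np.mean(m ** 2) for m in gamma]) if gamma else 1e-12
        feats.append(np.log(power))
    return np.array(feats)

def classify_conditions(epochs, labels):
    """Cross-validated LDA on gamma-band IMF features.

    `epochs` is a list of per-trial IMF arrays; `labels` codes the upcoming
    movement condition (hand + eyes, hand only, eyes only, lift).
    """
    X = np.stack([gamma_power_features(e) for e in epochs])
    clf = LinearDiscriminantAnalysis()
    return cross_val_score(clf, X, np.asarray(labels), cv=5)
```

The feature choice (per-channel log power of gamma IMFs) is only one reasonable reading of "gamma-band IMF features"; the same skeleton works with other per-band summaries or classifiers.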
