Abstract

Biological motion observation produces dynamic changes in sensorimotor activation that depend on the observed kinematics. The physical plausibility of the spatial-kinematic relationship of human movement may play a major role in the top-down processing of human motion recognition. Here, we investigated the time course of scalp activation during observation of human gait, with a view to future integration into a virtual reality (VR)-based brain-computer interface. From high-density EEG recordings, we analyzed event-related potentials (ERP), event-related spectral perturbation (ERSP), and inter-trial coherence (ITC) around video display onset (−200–600 ms), as well as steady-state visual evoked potentials (SSVEP) during a 3D animation of human walking, in three conditions: Normal; Upside-down (inverted images); and Uncoordinated (pseudo-randomly mixed images). The early visual evoked response P120 was decreased in the Upside-down condition. The N170 and P300b amplitudes were decreased in the Uncoordinated condition. In the Upside-down and Uncoordinated conditions, alpha power and theta phase-locking were decreased. Gamma power was increased during the Upside-down animation and decreased during the Uncoordinated animation. We also describe an SSVEP-like response oscillating at about 10 Hz, whose oscillating pattern was enhanced 300 ms after the heel-strike event in the Normal but not in the Upside-down condition. Our results are consistent with most previous point-light display studies, further supporting the possible use of virtual reality for neurofeedback applications.
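The two phase-sensitive measures named above can be illustrated in a few lines: the ERP is a plain trial average (only phase-locked activity survives), and ITC is the length of the trial-mean unit phase vector. The sketch below uses simulated epochs; the sampling rate, frequency band, trial count, and all variable names are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

# Hypothetical epoched EEG (trials x samples); all parameters are
# illustrative, not the settings used in the study.
fs = 512                              # assumed sampling rate (Hz)
t = np.arange(-0.2, 0.6, 1 / fs)      # epoch from -200 to 600 ms
rng = np.random.default_rng(0)
n_trials = 100

# Condition A: a 10 Hz component phase-locked across trials (SSVEP-like)
locked = np.sin(2 * np.pi * 10 * t) \
    + 0.5 * rng.standard_normal((n_trials, t.size))
# Condition B: same 10 Hz power, but a random phase on every trial
phases = rng.uniform(0, 2 * np.pi, n_trials)[:, None]
jittered = np.sin(2 * np.pi * 10 * t + phases) \
    + 0.5 * rng.standard_normal((n_trials, t.size))

def erp(epochs):
    """ERP: trial average; non-phase-locked activity cancels out."""
    return epochs.mean(axis=0)

def itc(epochs, band=(8, 12)):
    """Inter-trial coherence in a narrow band: magnitude of the
    trial-mean unit phase vector (0 = random phase, 1 = fully locked)."""
    b, a = butter(4, band, btype="bandpass", fs=fs)
    analytic = hilbert(filtfilt(b, a, epochs, axis=1), axis=1)
    return np.abs(np.mean(analytic / np.abs(analytic), axis=0))

# Phase-locked trials add up in the ERP and yield high ITC;
# phase-jittered trials cancel in the average and yield low ITC.
print(itc(locked).mean(), itc(jittered).mean())
```

The same logic extends to ERSP by averaging |analytic|² across trials and expressing it in dB relative to a pre-stimulus baseline.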

Highlights

  • Neuronal processing in the visual system allows us to perceive objects, movements, colors, and contrasts, and to represent the space around us with very high resolution

  • We first examined event-related potentials (ERP) and event-related spectral perturbation (ERSP) over [−1000; 3000] ms epochs, focusing on events related to animation onset, and steady-state visual evoked potentials (SSVEP) centered on the heel strike between −200 and 600 ms

  • The first noticeable ERP component, referenced to the earlobe and elicited after onset of the virtual reality (VR) animation, was the P120, recorded at occipito-parietal electrodes



Introduction

Neuronal processing in the visual system allows us to perceive objects, movements, colors, and contrasts, and to represent the space around us with very high resolution. In addition to the classical dichotomy between a ventral stream (the "What" pathway) supporting object vision and a dorsal stream (the "Where" pathway), a more recent conception based on clinical evidence (Kravitz et al., 2011) divides the dorsal stream into three sub-pathways projecting onto the premotor cortex (supporting visually guided actions), the prefrontal cortex, and the medial temporal lobe (supporting spatial working memory), both directly and through the posterior cingulate and retrosplenial areas (supporting navigation). Behavioral, neuroimaging and

Frontiers in Systems Neuroscience www.frontiersin.org
