Abstract

In recent studies, a number of research groups have reported that human electroencephalograms (EEG) exhibit scaling properties. In particular, a crossover between two regions with different scaling exponents has been reported. Herein we study the time evolution of diffusion entropy to elucidate the scaling of EEG time series. For a cohort of 20 awake healthy volunteers with closed eyes, we find that the diffusion entropy of EEG increments (obtained from EEG waveforms by differencing) exhibits three features: short-time growth, an alpha-wave-related oscillation whose amplitude gradually decays in time, and asymptotic saturation, which is reached after approximately 1 s. This analysis suggests a linear, stochastic Ornstein-Uhlenbeck Langevin equation with a quasiperiodic forcing (whose frequency and/or amplitude may vary in time) as the model for the underlying dynamics. This model captures the salient properties of EEG dynamics. In particular, both the experimental and simulated EEG time series exhibit short-time scaling which is broken by a strong periodic component, such as alpha waves. The saturation of EEG diffusion entropy precludes the existence of asymptotic scaling. We find that the crossover between two scaling regions seen in detrended fluctuation analysis (DFA) of EEG increments does not originate from the underlying dynamics but is merely an artifact of the algorithm. This artifact is rooted in the failure of the "trend plus signal" paradigm of DFA.
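
The modeling idea described above can be illustrated with a short simulation sketch: an Ornstein-Uhlenbeck process driven by a quasiperiodic (alpha-like) forcing, followed by a diffusion-entropy estimate on its increments. All parameter values (sampling rate, relaxation rate, forcing amplitude and frequency, noise strength) and the histogram-based entropy estimator below are illustrative assumptions, not values or routines taken from the paper.

```python
import numpy as np

# --- Ornstein-Uhlenbeck process with quasiperiodic forcing (illustrative) ---
# dx = -gamma * x * dt + A * sin(2*pi*f_alpha*t) * dt + sigma * dW
rng = np.random.default_rng(0)
fs = 250.0           # assumed sampling rate (Hz)
T = 60.0             # seconds of simulated signal
n = int(T * fs)
dt = 1.0 / fs

gamma = 25.0         # relaxation rate (1/s), assumed
A = 1.0              # forcing amplitude, assumed
f_alpha = 10.0       # alpha-band forcing frequency (Hz)
sigma = 1.0          # noise strength, assumed

t = np.arange(n) * dt
x = np.zeros(n)
for i in range(1, n):
    drive = A * np.sin(2.0 * np.pi * f_alpha * t[i - 1])
    x[i] = x[i - 1] + (-gamma * x[i - 1] + drive) * dt \
           + sigma * np.sqrt(dt) * rng.standard_normal()

increments = np.diff(x)   # analogue of the "EEG increments" obtained by differencing

# --- Diffusion entropy of the increments ---
# Sum increments over windows of length L to build diffusion trajectories,
# estimate the pdf p(y, L) with a histogram, and compute the Shannon entropy S(L).
def diffusion_entropy(xi, window_lengths, bins=50):
    S = []
    csum = np.cumsum(np.insert(xi, 0, 0.0))
    for L in window_lengths:
        sums = csum[L:] - csum[:-L]            # overlapping partial sums of length L
        p, edges = np.histogram(sums, bins=bins, density=True)
        width = np.diff(edges)
        mask = p > 0
        S.append(-np.sum(p[mask] * np.log(p[mask]) * width[mask]))
    return np.array(S)

Ls = np.unique(np.logspace(0, np.log10(2 * fs), 30).astype(int))  # up to ~2 s
S = diffusion_entropy(increments, Ls)

# For a scaling pdf p(y, L) ~ L^-delta F(y / L^delta), S(L) = const + delta * ln(L),
# so delta is the slope of S versus ln(L) in the short-time region.
short = Ls <= int(0.1 * fs)
delta = np.polyfit(np.log(Ls[short]), S[short], 1)[0]
print(f"short-time diffusion-entropy slope delta ~ {delta:.2f}")
```

With parameters in this qualitative regime, S(L) computed from such a simulation is expected to show the three features the abstract lists for the experimental data: initial growth, an oscillation at the forcing period, and eventual saturation once the damped, driven process decorrelates; the short-time slope plays the role of the scaling exponent that DFA would otherwise report.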
