Abstract

The theory of Shannon entropy was applied to the Choi-Williams time-frequency distribution (CWD) of time series in order to extract entropy information in both the time and frequency domains. In this way, four novel indexes were defined: (1) partial instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function at each time instant taken independently; (2) partial spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of each frequency value taken independently; (3) complete instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function of the entire CWD; (4) complete spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of the entire CWD. These indexes were tested on synthetic time series with different behaviors (periodic, chaotic and random) and on a dataset of electroencephalographic (EEG) signals recorded in different states (eyes-open, eyes-closed, ictal and non-ictal activity). The results showed that the values of these indexes tend to decrease, to different degrees, as the behavior of the synthetic signals evolves from chaos or randomness to periodicity. Statistically significant differences (p-value < 0.0005) were found between the values of these measures when comparing eyes-open and eyes-closed states, and between ictal and non-ictal states, in the traditional EEG frequency bands. Finally, the results demonstrate that the proposed measures can be useful tools to quantify the different periodic, chaotic and random components in EEG signals.

Highlights

  • Since the works of Kotelnikov and Shannon [1,2], the information content of an event, defined in terms of the inverse of the probability of its occurrence, has proved extremely useful for its significance and inherent conceptual simplicity. The classical Shannon entropy measures the average information provided by a set of events and quantifies their uncertainty

  • Four novel indexes are defined on the Choi-Williams time-frequency distribution (CWD): (1) partial instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function at each time instant taken independently; (2) partial spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of each frequency value taken independently; (3) complete instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function of the entire CWD; (4) complete spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of the entire CWD

  • Observing the quantization-frequency distributions in Figures 2b,d, the information is concentrated in fewer quantization bins for cF than for pF, because the quantization in cF takes the complete CWD into account


Summary

Introduction

Since the works of Kotelnikov and Shannon [1,2], the information content of an event, defined in terms of the inverse of the probability of its occurrence, has proved extremely useful for its significance and inherent conceptual simplicity. Four novel indexes are defined on the CWD: (1) partial instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function at each time instant taken independently; (2) partial spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of each frequency value taken independently; (3) complete instantaneous entropy, calculated as the entropy of the CWD with respect to time by using the probability mass function of the entire CWD; (4) complete spectral information entropy, calculated as the entropy of the CWD with respect to frequency by using the probability mass function of the entire CWD. These indexes are tested on synthetic time series that simulate signals in which different behaviors (periodic, chaotic and random) are combined, and on a dataset of electroencephalographic (EEG) signals recorded in different states (eyes-open, eyes-closed, ictal and non-ictal activity). EEG signals are selected since they are generated by nonlinear deterministic processes with nonlinear coupling interactions between neuronal populations [7]
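The four indexes above can be sketched in a few lines of NumPy, under one plausible reading of the definitions: the "partial" variants renormalize each time (or frequency) slice into its own probability mass function, while the "complete" variants use the single pmf obtained by normalizing the entire distribution. The function name `tfd_entropies`, the array layout, and the use of the absolute value of the distribution (the CWD can take negative values) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy (in bits) of a nonnegative vector p; zero bins are skipped."""
    p = p[p > eps]
    return -np.sum(p * np.log2(p))

def tfd_entropies(C):
    """Entropy profiles of a nonnegative time-frequency distribution C[f, t].

    Returns, one value per time instant or per frequency:
      pI - partial instantaneous entropy      (pmf of each time slice, independently)
      pF - partial spectral information entropy (pmf of each frequency slice, independently)
      cI - complete instantaneous entropy     (pmf of the entire distribution)
      cF - complete spectral information entropy (pmf of the entire distribution)
    """
    C = np.abs(C)                                 # CWD values may be negative
    col = C / C.sum(axis=0, keepdims=True)        # pmf over frequency at each time t
    row = C / C.sum(axis=1, keepdims=True)        # pmf over time at each frequency f
    joint = C / C.sum()                           # pmf over the entire distribution
    pI = np.array([shannon_entropy(col[:, t]) for t in range(C.shape[1])])
    pF = np.array([shannon_entropy(row[f, :]) for f in range(C.shape[0])])
    cI = np.array([shannon_entropy(joint[:, t]) for t in range(C.shape[1])])
    cF = np.array([shannon_entropy(joint[f, :]) for f in range(C.shape[0])])
    return pI, pF, cI, cF
```

For a flat (maximally uncertain) 4 x 4 distribution, each partial instantaneous entropy equals log2(4) = 2 bits, while each complete instantaneous entropy is smaller (1 bit), since the complete pmf spreads the unit mass over all 16 bins; a purely periodic signal, concentrating its energy in few bins, would drive all four indexes down.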

Time-Frequency Representation
Shannon Entropies
Instantaneous Entropy and Spectral Information Entropy
Synthetic Signals
Real EEG Recordings
Real EEG Signals
Conclusions
