Abstract

Permutation Entropy (PE) is a powerful tool for measuring the amount of information contained within a time series. However, this technique is rarely applied directly to raw signals. Instead, a preprocessing step, such as linear filtering, is applied in order to remove noise or to isolate specific frequency bands. In the current work, we aimed to outline the effect of linear filter preprocessing on the final PE values. By means of the Wiener–Khinchin theorem, we theoretically characterized the linear filter’s intrinsic PE and separated its contribution from the signal’s ordinal information. We tested these results on simulated signals subjected to a variety of linear filters, such as the moving average, Butterworth, and Chebyshev type I. The PE results from simulations closely matched our predictions for all tested filters, which validated our theoretical propositions. More importantly, when we applied linear filters to signals with inner correlations, we were able to theoretically decouple the signal-specific contribution from that induced by the linear filter. Therefore, by providing a proper framework for characterizing the PE of linear filters, we improved the interpretation of PE by identifying possible artifact information introduced by the preprocessing steps.

Highlights

  • Each signal was subject to a variety of lowpass, highpass, and bandpass linear filters

  • To test the theoretical results, we applied a series of lowpass, highpass, and bandpass linear filters on the uncorrelated signals, such as white Gaussian noise and white uniform noise

Introduction

Information entropy was first proposed by Shannon in his seminal paper “A Mathematical Theory of Communication” [1]. This measure effectively assesses the amount of “surprise” (new information) contained in any given outcome of a random variable with a known distribution function. One noteworthy example is Permutation Entropy (PE) [10], which measures the distribution of the ordinal patterns of the signal instead of its cardinal values. This approach is robust to noise and computationally efficient, and it requires no prior knowledge of the signal’s internal structure to successfully measure its information content.
