Abstract

We describe how to analyze the wide class of non-stationary processes with stationary centered increments using Shannon information theory. To do so, we adopt a practical viewpoint and define ersatz quantities from time-averaged probability distributions. These ersatz versions of entropy, mutual information, and entropy rate can be estimated when only a single realization of the process is available. We illustrate our approach extensively by analyzing Gaussian and non-Gaussian self-similar signals, as well as multifractal signals. The Gaussian signals allow us to check that our approach is robust, in the sense that all quantities behave as expected from analytical derivations. Using the stationarity (independence of the integration time) of the ersatz entropy rate, we show that this quantity not only finely probes the self-similarity of the process but also offers a new way to quantify its multifractality.

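The abstract's time-averaged viewpoint can be made concrete with a short sketch: pool the increments of a single realization at a given scale τ and estimate the Shannon entropy of that pooled (time-averaged) distribution. The Python snippet below is only an illustration of this idea; it assumes a plain histogram estimator, uses ordinary Brownian motion (H = 1/2) as a stand-in for the self-similar signals studied in the paper, and the function name `hist_entropy` and all parameter choices are hypothetical rather than taken from the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def hist_entropy(samples, bins=64):
    """Histogram estimate of the differential entropy (in nats)."""
    counts, edges = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    width = np.diff(edges)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz] / width[nz]))

# A single realization of ordinary Brownian motion (H = 1/2):
# a non-stationary process whose increments are stationary.
x = np.cumsum(rng.standard_normal(2**17))

# "Ersatz" entropy of the increments at scale tau, obtained by
# pooling increments over the whole realization (time average).
for tau in (1, 4, 16, 64, 256):
    increments = x[tau:] - x[:-tau]
    print(tau, hist_entropy(increments))
```

For Gaussian increments of standard deviation τ^H the differential entropy is H ln τ plus a constant, so the printed values should increase by roughly 0.5 ln 4 ≈ 0.69 between successive scales.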
Highlights

  • Many real-world processes, like global weather data, water reservoir levels, biological or medical signals, economic time series, etc., are intrinsically non-stationary [1,2,3,4,5]: their probability density function (PDF) deforms as time evolves

  • Exactly as for the fractional Brownian motion (fBm), the standard deviation is large for the ersatz entropy and the ersatz auto-mutual information, while it is much smaller for the ersatz entropy rate

  • One can estimate the Hurst exponent of a perfectly self-similar process as the slope of a linear fit in ln τ of the ersatz entropy rate. This approach is valid for the fBm and for the motion built from the noise constructed with the even-Hermitian transformation, because the ersatz entropy rate behaves linearly in ln τ (see the sketch after this list)

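As a complement to the last highlight, here is a hedged sketch of the Hurst-exponent fit. For a Gaussian self-similar process the entropy of the pooled increments at scale τ grows as H ln τ plus a constant, just like the ersatz entropy rate described in the highlight, so it is used below as a simple stand-in; the Brownian-motion generator (true H = 0.5), the histogram estimator, and the chosen scales are assumptions made only so the example runs end to end.

```python
import numpy as np

rng = np.random.default_rng(1)

def hist_entropy(samples, bins=64):
    """Histogram estimate of the differential entropy (in nats)."""
    counts, edges = np.histogram(samples, bins=bins)
    p = counts / counts.sum()
    width = np.diff(edges)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz] / width[nz]))

# Single realization of ordinary Brownian motion, true Hurst exponent H = 0.5.
x = np.cumsum(rng.standard_normal(2**18))

taus = np.array([2, 4, 8, 16, 32, 64, 128, 256])
entropies = [hist_entropy(x[t:] - x[:-t]) for t in taus]

# Linear fit in ln(tau): the slope estimates the Hurst exponent.
slope, intercept = np.polyfit(np.log(taus), entropies, 1)
print(f"estimated H ≈ {slope:.3f} (expected 0.5)")
```

With this many samples the fitted slope should land close to 0.5; replacing the cumulative sum of white noise by an fBm generator with a different H would move the slope accordingly.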

Summary

Introduction

Many real-world processes, like global weather data, water reservoir levels, biological or medical signals, economic time series, etc., are intrinsically non-stationary [1,2,3,4,5]: their probability density function (PDF) deforms as time evolves. Analyzing such processes requires a stationarity hypothesis in order to apply classical analyses such as two-point correlation assessment [6].

Non-Stationary Processes with Stationary Increments
General Framework
Shannon Entropy
Mutual Information and Auto-Mutual Information
Entropy Rate
Practical Time-Averaged Framework
Time-Averaged Framework
Practical Framework
Information Theory Quantities in the Practical Framework
Self-Similar Processes
Benchmarking the Practical Framework with the fBm
Characterization of the Estimates
Procedure
Standard Deviation of the Estimates
Dependence on Times T and τ
Entropy and Auto-Mutual Information
Entropy Rate Dependence on Scale τ
Bias and Standard Deviation
Application of the Practical Framework to a Multifractal Process
Discussion and Conclusions
References