Abstract

The enormous amount of currently available data demands efforts to extract meaningful information. For this purpose, several measures are applied, including Shannon's entropy, permutation entropy, and the Lempel-Ziv complexity. These methods have been used in many applications, such as pattern recognition and series classification, across several areas (e.g., physical, financial, and biomedical). Data in these applications are often presented as binary series with temporal correlations. Herein, we compare measures of information entropy in binary series conveying short- and long-range temporal correlations characterized by the Hurst exponent H. Combining numerical and analytical approaches, we scrutinize several established methods and show that they are inefficient at detecting temporal correlations. To overcome this limitation, we propose a measure called the binary permutation index (BPI). We demonstrate that BPI efficiently discriminates patterns embedded in the series, offering advantages over previous methods. Subsequently, we collect stock market time series and rain precipitation data, and perform in vivo electrophysiological recordings in the hippocampus of an experimental animal model of temporal lobe epilepsy, demonstrating the application of BPI to both publicly available and experimental data. The proposed index evaluates information entropy, enabling the discrimination of randomness and the extraction of meaningful information from binary time series.
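One of the baseline measures mentioned above, permutation entropy (Bandt and Pompe's ordinal-pattern entropy), can be sketched compactly. The snippet below is a minimal illustration, not the paper's BPI: it counts ordinal patterns in sliding windows and returns the normalized Shannon entropy of their distribution. Note that binary series produce many tied values; here ties are broken by order of occurrence (a stable sort), which is one common convention among several, and this tie-handling ambiguity is precisely why binary data are a delicate case for ordinal methods.

```python
import math
from collections import Counter

def permutation_entropy(series, m=3):
    """Normalized permutation entropy of a sequence (sketch).

    Each length-m window is mapped to its ordinal pattern (the
    permutation that sorts it); ties are broken by position via a
    stable sort, one common convention. Returns a value in [0, 1].
    """
    if len(series) < m:
        raise ValueError("series shorter than embedding dimension")
    patterns = Counter()
    for i in range(len(series) - m + 1):
        window = series[i:i + m]
        # Ordinal pattern: indices of the window in ascending value
        # order; Python's sort is stable, so ties keep their order.
        pattern = tuple(sorted(range(m), key=lambda k: window[k]))
        patterns[pattern] += 1
    total = sum(patterns.values())
    h = -sum((c / total) * math.log2(c / total)
             for c in patterns.values())
    return h / math.log2(math.factorial(m))  # normalize by log2(m!)
```

For example, a constant series yields entropy 0 (a single pattern), while an alternating binary series 0,1,0,1,... produces exactly two patterns in equal proportion, giving 1 bit before normalization.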
