Abstract

Entropy estimation of information sources is highly nontrivial for symbol sequences with strong long-range correlations. The rabbit sequence, related to the symbolic dynamics of the nonlinear circle map at the critical point, as well as the logistic map at the Feigenbaum point, have been argued to exhibit long memory tails. For both dynamical systems, the block entropy of order n has been shown to increase like log(n). In contrast to such probabilistic concepts, we investigate the scaling behavior of non-probabilistic entropy estimation schemes suggested by Lempel and Ziv in the context of algorithmic complexity and data compression. These are applied in a sequential manner, with the scaling variable being the length N of the sequence. We determine the scaling law for the Lempel-Ziv entropy estimate applied to the critical circle map and to the logistic map at the Feigenbaum point in a binary partition.
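As an illustration of the kind of estimator discussed above (a minimal sketch, not the paper's own code), the following applies the Lempel-Ziv 1976 phrase-counting complexity in the Kaspar-Schuster formulation to the rabbit (Fibonacci) sequence, generated here by the substitution 0 → 01, 1 → 0. The function names and the normalization h(N) = c(N) · log2(N) / N are illustrative choices, assuming the standard LZ76 parsing.

```python
import math
import random

def rabbit_sequence(length):
    """Rabbit (Fibonacci) word via the substitution 0 -> 01, 1 -> 0."""
    s = "0"
    while len(s) < length:
        s = "".join("01" if c == "0" else "0" for c in s)
    return s[:length]

def lz76_complexity(s):
    """Phrase count c(N) of the Lempel-Ziv (1976) parsing of s,
    in the Kaspar-Schuster formulation of the algorithm."""
    n = len(s)
    c, l, i, k, k_max = 1, 1, 0, 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            # current phrase still matches an earlier occurrence; extend it
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            # mismatch: remember the longest match, try the next start index
            k_max = max(k, k_max)
            i += 1
            if i == l:
                # no earlier occurrence covers the phrase: close it, start a new one
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def lz_entropy_estimate(s):
    """Sequential entropy estimate h(N) = c(N) * log2(N) / N in bits/symbol."""
    n = len(s)
    return lz76_complexity(s) * math.log2(n) / n

# Usage: the quasiperiodic rabbit sequence yields a far smaller estimate
# than a random binary sequence of the same length N.
rabbit = rabbit_sequence(4096)
random.seed(0)
coin = "".join(random.choice("01") for _ in range(4096))
print(lz_entropy_estimate(rabbit), lz_entropy_estimate(coin))
```

For a fair coin the estimate approaches the source entropy of 1 bit/symbol as N grows, while for the rabbit sequence the phrase count grows only sublinearly in N, so the estimate tends to zero; the abstract's question is precisely the scaling law of this decay.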
