Abstract

Recent results from the area of Deterministic Chaos have considerable significance for practitioners of Information Theory. In 1959 Kolmogorov observed that Shannon's probabilistic theory of information could be applied to symbolic encodings of the phase-space descriptions of physical non-linear dynamical systems, so that one might characterise a process in terms of its Kolmogorov–Sinai entropy. Pesin's theorem (1977) proves that for certain deterministic non-linear dynamical systems exhibiting chaotic behaviour, the Kolmogorov–Sinai entropy h_KS is given by the sum of the positive Lyapunov exponents of the process, i.e. h_KS = ∑ᵢ λᵢ⁺ [3]. For a number of simple non-linear processes the Lyapunov exponents may be computed very precisely, so such a system may be viewed as an information source whose source entropy is accurately known. The existence of simple 'calibrated' sources such as the logistic map, xₙ₊₁ = f(xₙ) = rxₙ(1−xₙ) [3], provides a means for precisely evaluating not only the performance of compression schemes but also information measures such as the grammar-based measures described in [1][4].

Building on [1][2], the authors have computed the average T-entropy from sample encodings of the symbolic dynamics of the logistic map and compared these values directly with the corresponding known Lyapunov exponents. As the accompanying figure shows, the average T-entropy for strings 10 bits long agrees closely with the positive Lyapunov exponents of this one-dimensional dynamical system. The values are plotted as a function of the system parameter r at increments of 0.0001. The difference between the T-entropy and the Lyapunov exponents averages about 1% RMS of the full-scale value ln 2 over the whole range of r. Such agreement may be interpreted as strong evidence of the link between this grammar-based information measure for finite strings and Shannon's probabilistic entropy measure for information sources. That the process imbues individual finite sample strings with its corresponding information characteristics echoes our quotation from Kolmogorov. Clearly the T-entropy reflects the fine structure of chaotic attractors. Thus Deterministic Information Theory [2] appears to offer a new approach to evaluating the limits of compression for individual finite strings, while results from Deterministic Chaos Theory allow one to select precisely calibrated information sources with which to assess the performance of specific compression algorithms.
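For concreteness, the following is a minimal Python sketch (not taken from [1][2]) of the two calibration ingredients described above: estimating the Lyapunov exponent λ(r) of the logistic map by averaging ln|f′(x)| = ln|r(1 − 2x)| along an orbit, and generating a binary symbolic-dynamics string over the standard generating partition at x = 1/2. Function names and parameter defaults are illustrative; the T-entropy computation itself (the T-decomposition of [1][4]) is not reproduced here.

from math import log

def logistic(x, r):
    # One iterate of the logistic map x_{n+1} = r * x_n * (1 - x_n).
    return r * x * (1.0 - x)

def lyapunov_exponent(r, n=200_000, transient=1_000, x0=0.3):
    # Estimate lambda(r) = lim (1/n) * sum ln|f'(x_i)|, where f'(x) = r(1 - 2x).
    x = x0
    for _ in range(transient):   # let the orbit settle onto the attractor
        x = logistic(x, r)
    total = 0.0
    for _ in range(n):
        x = logistic(x, r)
        total += log(abs(r * (1.0 - 2.0 * x)))
    return total / n

def symbolic_string(r, n_bits, x0=0.3, transient=1_000):
    # Binary symbolic dynamics over the generating partition at x = 1/2:
    # emit '0' when x < 0.5 and '1' otherwise.
    x = x0
    for _ in range(transient):
        x = logistic(x, r)
    bits = []
    for _ in range(n_bits):
        x = logistic(x, r)
        bits.append('0' if x < 0.5 else '1')
    return ''.join(bits)

if __name__ == '__main__':
    # At r = 4 the map is fully chaotic and lambda = ln 2 exactly, so the
    # source is 'calibrated' at one bit of information per symbol.
    print(lyapunov_exponent(4.0))    # approximately 0.6931 (= ln 2)
    print(symbolic_string(3.8, 64))  # a sample string for a T-entropy estimate

Sweeping r over its range in small increments and comparing an entropy estimate for such strings against λ⁺(r) = max(λ(r), 0) reproduces, under these assumptions, the kind of comparison reported above.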
