Abstract

The main purpose of the paper is to extend the results of Ellerman (Int. J. Semant. Comput. 7:121–145, 2013) to the case of dynamical systems. We define the logical entropy and conditional logical entropy of finite measurable partitions and derive the basic properties of these measures. Subsequently, the suggested concept of logical entropy of finite measurable partitions is used to define the logical entropy of a dynamical system. It is proved that two metrically isomorphic dynamical systems have the same logical entropy. Finally, we provide a logical version of the Kolmogorov–Sinai theorem on generators. Thus it is shown that replacing the Shannon entropy function with the logical entropy function yields results analogous to those of the classical Kolmogorov–Sinai entropy theory of dynamical systems.

Highlights

  • The concept of entropy plays a central role in information theory [2]; entropy quantifies the amount of information involved in the outcome of a random process. Entropy has found applications in other areas, including physics, computer science, statistics, chemistry, biology, sociology, and general systems theory; in addition, the whole modern telecommunications industry is based on this quantification of information.

  • The main aim of this paper is to extend the study of logical entropy presented in [1] to the case of dynamical systems; by replacing the Shannon entropy function (1.1) with the logical entropy function (1.2), we construct an isomorphism theory of the Kolmogorov–Sinai type.

  • Remark 4.2 From Theorem 4.6 it follows that if hL(T1) ≠ hL(T2), the corresponding dynamical systems (Ω1, S1, μ1, T1), (Ω2, S2, μ2, T2) are metrically non-isomorphic. This means that the logical entropy distinguishes metrically non-isomorphic dynamical systems; so we have acquired an alternative tool for distinguishing non-isomorphic dynamical systems.


Introduction

The concept of entropy plays a central role in information theory [2]; entropy quantifies the amount of information involved in the outcome of a random process. The Kolmogorov–Sinai entropy [4,5,6,7] provides an important generalization of Shannon entropy; it has strongly influenced our understanding of the complexity of dynamical systems. For some specific problems, it is preferable to replace Shannon entropy with an approach based on the concept of logical entropy [1, 8, 9] (see [10,11,12,13,14,15,16]). The main aim of this paper is to extend the study of logical entropy presented in [1] to the case of dynamical systems; by replacing the Shannon entropy function (1.1) with the logical entropy function (1.2), we construct an isomorphism theory of the Kolmogorov–Sinai type. It is proved that metrically isomorphic dynamical systems have the same logical entropy.
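The contrast between the two entropy functions can be sketched numerically: for a probability distribution p = (p1, …, pn), the Shannon entropy is H(p) = −Σ pi log2 pi, while the logical entropy is hL(p) = Σ pi(1 − pi) = 1 − Σ pi². The following minimal sketch (function names are illustrative, not from the paper) computes both for the same distribution:

```python
import math

def shannon_entropy(p):
    # Shannon entropy H(p) = -sum p_i * log2(p_i); terms with p_i = 0 contribute 0.
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def logical_entropy(p):
    # Logical entropy h_L(p) = sum p_i * (1 - p_i) = 1 - sum p_i^2.
    return 1.0 - sum(pi * pi for pi in p)

# Uniform distribution on 4 outcomes: H = log2(4) = 2, h_L = 1 - 1/4 = 0.75.
p = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy(p))  # 2.0
print(logical_entropy(p))  # 0.75
```

Both quantities are maximized by the uniform distribution and vanish on a degenerate one; logical entropy is bounded by 1 − 1/n, whereas Shannon entropy grows as log2(n).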
