Abstract

Entropies are expressed in terms of mean values, rather than as weighted arithmetic means of their generating functions, which result in pseudo-additive entropies. The Shannon entropy corresponds to the logarithm of the inverse of the geometric mean, while the Rényi entropy corresponds, more generally, to the logarithm of the inverse of power means of order τ < 1. Translation invariance of the means relates them to mean code lengths, while their homogeneity translates them into entropies: under the Kraft equality, the arithmetic and exponential means correspond to the Shannon and Rényi entropies, respectively, while under the Kraft inequality the entropies are lower bounds on the mean code lengths. Means of any order cannot be expressed as escort averages, because such averages contradict the fact that the means are monotonically increasing functions of their order. Exponential entropies are shown to be measures of the extent of a distribution. The probability measure and the incomplete probability distribution are shown to be the ranges of continuous and discrete sample spaces, respectively. A comparison is made with Boltzmann's principle.
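The entropy–mean correspondence stated in the abstract can be checked numerically. The sketch below (an illustration, not code from the paper; the distribution and function names are chosen for the example) verifies that the Shannon entropy equals the logarithm of the inverse of the self-weighted geometric mean of the probabilities, and that the Rényi entropy of order α equals the logarithm of the inverse of the power mean of order τ = α − 1, so that τ < 1 corresponds to α < 2.

```python
import math

def shannon(p):
    # Shannon entropy in nats: H = -sum p_i log p_i
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi(p, alpha):
    # Renyi entropy of order alpha: H_a = log(sum p_i^alpha) / (1 - alpha)
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def power_mean(values, weights, tau):
    # Weighted power mean of order tau; the tau -> 0 limit is the
    # weighted geometric mean.
    if tau == 0:
        return math.exp(sum(w * math.log(v) for w, v in zip(weights, values)))
    return sum(w * v ** tau for w, v in zip(weights, values)) ** (1.0 / tau)

# Example distribution (hypothetical, for illustration only)
p = [0.5, 0.25, 0.125, 0.125]

# Shannon entropy = log of the inverse geometric mean (weights = values = p)
assert abs(shannon(p) - (-math.log(power_mean(p, p, 0)))) < 1e-12

# Renyi entropy of order alpha = log of the inverse power mean of
# order tau = alpha - 1
alpha = 0.5
tau = alpha - 1          # tau < 1, as in the abstract
assert abs(renyi(p, alpha) - (-math.log(power_mean(p, p, tau)))) < 1e-12
```

Both identities hold for any discrete distribution; the power mean here is weighted by the probabilities themselves, which is exactly why forcing it into an escort-average form would conflict with its monotonicity in the order τ.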


