Abstract

The Khinchin–Shannon generalized inequalities for entropy measures in Information Theory are a paradigm that can be used to test the synergy of the distributions of probabilities of occurrence in physical systems. The rich algebraic structure associated with the introduction of escort probabilities seems to be essential for deriving these inequalities for the two-parameter Sharma–Mittal set of entropy measures. We also emphasize the derivation of these inequalities for the special cases of the one-parameter Havrda–Charvát, Rényi and Landsberg–Vedral entropy measures.
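
For orientation, the display below sketches a common parameterization of the two-parameter Sharma–Mittal entropy, the escort probabilities built from a distribution p, and the one-parameter special cases named above recovered as limits; the parameter names r and s are an assumption taken from the standard literature and may differ from the notation used in the paper itself.

\[
S_{r,s}(p)=\frac{1}{1-r}\left[\Big(\sum_{j} p_j^{\,s}\Big)^{\frac{1-r}{1-s}}-1\right],
\qquad
P_j=\frac{p_j^{\,s}}{\sum_{k} p_k^{\,s}}\quad\text{(escort probabilities)},
\]
\[
\lim_{r\to s}S_{r,s}=\frac{1}{1-s}\Big(\sum_{j}p_j^{\,s}-1\Big)\ \text{(Havrda–Charvát)},
\qquad
\lim_{r\to 1}S_{r,s}=\frac{\ln\sum_{j}p_j^{\,s}}{1-s}\ \text{(Rényi)},
\]
\[
S_{2-s,\,s}=\frac{1}{1-s}\Big(1-\frac{1}{\sum_{j}p_j^{\,s}}\Big)\ \text{(Landsberg–Vedral)},
\qquad
\lim_{r,s\to 1}S_{r,s}=-\sum_{j}p_j\ln p_j\ \text{(Gibbs–Shannon)}.
\]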

Highlights

  • In the present contribution we derive the Generalized Khinchin–Shannon inequalities (GKS) [1,2] associated with the entropy measures of the Sharma–Mittal (SM) set [3].

  • We stress that the derivations presented here are a tentative way of implementing ideas from the interdisciplinary literature on Statistical Mechanics and Information Theory [4,5,6].

  • We present a proposal for an information measure associated with SM entropies and derive its related inequalities [10]. At this point we stress once more the emergence of the synergy effect when comparing the information obtained from the entropy calculated with joint probabilities of occurrence with the entropies corresponding to simple probabilities; a numerical sketch follows these highlights.
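
The sketch below is a numerical illustration of this synergy effect in the Gibbs–Shannon limit: it compares the entropy of a joint distribution with the sum of the entropies of its marginals and evaluates the Sharma–Mittal special cases. The function name, the test distribution and the parameter convention (r, s) are illustrative assumptions, not the paper's own code or notation.

# Minimal sketch (assumed notation): Sharma-Mittal entropy, its limits,
# and the Gibbs-Shannon synergy H(X,Y) <= H(X) + H(Y).
import numpy as np

def sharma_mittal(p, r, s):
    """Sharma-Mittal entropy S_{r,s}(p) of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                   # drop zero-probability states
    if np.isclose(r, 1.0) and np.isclose(s, 1.0):
        return -np.sum(p * np.log(p))              # Gibbs-Shannon limit
    if np.isclose(r, 1.0):
        return np.log(np.sum(p ** s)) / (1.0 - s)  # Renyi limit
    power_sum = np.sum(p ** s)
    return (power_sum ** ((1.0 - r) / (1.0 - s)) - 1.0) / (1.0 - r)

# A joint distribution of occurrence on a 2 x 3 array and its marginals.
p_joint = np.array([[0.10, 0.25, 0.15],
                    [0.20, 0.05, 0.25]])
p_x = p_joint.sum(axis=1)   # marginal ("simple") probabilities of the rows
p_y = p_joint.sum(axis=0)   # marginal ("simple") probabilities of the columns

# Synergy in the Gibbs-Shannon limit: the joint entropy never exceeds
# the sum of the marginal entropies.
H_joint = sharma_mittal(p_joint.ravel(), 1.0, 1.0)
H_sum = sharma_mittal(p_x, 1.0, 1.0) + sharma_mittal(p_y, 1.0, 1.0)
print(f"H(X,Y) = {H_joint:.4f} <= H(X) + H(Y) = {H_sum:.4f}")

# One-parameter special cases recovered from the two-parameter family:
# r -> s gives Havrda-Charvat, r -> 2 - s gives Landsberg-Vedral.
s = 0.7
print("Havrda-Charvat  :", sharma_mittal(p_x, s, s))
print("Landsberg-Vedral:", sharma_mittal(p_x, 2.0 - s, s))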


Summary

Introduction

In the present contribution we derive the Generalized Khinchin–Shannon inequalities (GKS) [1,2] associated with the entropy measures of the Sharma–Mittal (SM) set [3]. A detailed study is undertaken of the eventual ordering between the probabilities of occurrence and their associated escort probabilities; this is enough for deriving the GKS inequalities for the SM entropy measures. At this point we stress once more the emergence of the synergy effect when comparing the information obtained from the entropy calculated with joint probabilities of occurrence with the entropies corresponding to simple probabilities.
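
To make the ordering question concrete, note that with the standard escort-probability definition (again an assumption about the paper's notation),

\[
P_j=\frac{p_j^{\,s}}{\sum_{k}p_k^{\,s}},
\qquad
\frac{P_j}{p_j}=\frac{p_j^{\,s-1}}{\sum_{k}p_k^{\,s}},
\]

so that $p_j \le p_k \iff P_j \le P_k$ for every $s>0$; moreover, since $P_j/p_j$ increases with $p_j$ when $s>1$ (and decreases when $0<s<1$) while both distributions sum to unity, the escort transformation accentuates the larger probabilities of occurrence for $s>1$ and flattens the distribution for $0<s<1$.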

The Assumption of Concavity and the Synergy of Gibbs–Shannon
An Information Measure Proposal Associated to Sharma–Mittal Entropy Measures
Concluding Remarks