Abstract

In this paper we remark that Shannon entropy can be expressed as a function of the self-information (i.e. the logarithm) and the inverse of the Lambert $W$ function. That is, Shannon entropy can be written in the trace form $-k \sum_{i} W^{-1} \circ \ln(p_{i})$. Based on this remark we define a generalized entropy which admits Shannon entropy as a limit. To facilitate the reasoning, this generalized entropy is obtained by a one-parameter deformation of the logarithmic function. By introducing a new concept of independence of two systems, the Shannon additivity is replaced by a non-commutative and non-associative law whose limit is the usual addition. The main properties associated with the generalized entropy are established, particularly those corresponding to statistical ensembles. Boltzmann-Gibbs statistics is recovered as a limit. The connection with thermodynamics is also studied. We also provide a guideline for systematically defining a deformed algebra whose limit is classical linear algebra. As an illustrative example we study a generalized entropy based on the Tsallis self-information. We point out possible connections between deformed algebras and fuzzy logics. Finally, noticing that the new concept of independence is based on a t-norm, the one-parameter deformation of the logarithm is interpreted as an additive generator of t-norms.
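
For concreteness, a short sketch of the standard identities behind the trace form (using only the defining relation of the Lambert $W$ function, with $f$ introduced here as notation for an additive generator): since $W(x)\, e^{W(x)} = x$, the inverse map is $W^{-1}(y) = y\, e^{y}$, so that

\[
W^{-1}\bigl(\ln p_i\bigr) = \ln(p_i)\, e^{\ln p_i} = p_i \ln p_i,
\qquad
-k \sum_{i} W^{-1} \circ \ln(p_i) = -k \sum_{i} p_i \ln p_i,
\]

which is the usual Shannon (Boltzmann-Gibbs) entropy. Likewise, in the standard theory of t-norms the product t-norm $T(x, y) = x y$ has additive generator $f(x) = -\ln x$, since $f^{-1}\bigl(f(x) + f(y)\bigr) = e^{\ln x + \ln y} = x y$; this is the sense in which a (deformed) logarithm can serve as an additive generator of t-norms.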
