Abstract

Based on a diffusion-like master equation we propose a formula using the Bregman divergence for measuring entropic distance in terms of different non-extensive entropy expressions. We obtain the non-extensivity parameter range for a universal approach to the stationary distribution by simple diffusive dynamics for the Tsallis and the Kaniadakis entropies, for the Hanel-Thurner generalization, and finally for a recently suggested log-log type entropy formula which belongs to diverging variance in the inverse temperature superstatistics.
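To make the construction concrete, the following minimal Python sketch computes a Bregman-type divergence between two discrete probability distributions, taking the trace-form negative Tsallis entropy as the convex generator. The choice of generator, and hence the resulting formula, is an illustrative assumption here, not the paper's exact expression; for q approaching 1 the divergence reduces to the familiar Kullback–Leibler form.

```python
import numpy as np

def tsallis_phi(p, q):
    """Per-state negative Tsallis entropy density, phi_q(p) = (p**q - p)/(q - 1)."""
    return (p**q - p) / (q - 1.0)

def tsallis_phi_prime(p, q):
    """Derivative of phi_q with respect to p."""
    return (q * p**(q - 1.0) - 1.0) / (q - 1.0)

def bregman_distance(P, Q, q):
    """Bregman divergence induced by the convex generator phi_q (valid for q > 0, q != 1).

    Non-negative because phi_q is convex; vanishes only for P == Q.
    """
    return np.sum(tsallis_phi(P, q) - tsallis_phi(Q, q)
                  - tsallis_phi_prime(Q, q) * (P - Q))

# Two normalized distributions on four states (illustrative numbers only).
P = np.array([0.4, 0.3, 0.2, 0.1])
Q = np.array([0.25, 0.25, 0.25, 0.25])

for q in (0.5, 0.999, 1.5):
    print(f"q = {q}: rho[P||Q] = {bregman_distance(P, Q, q):.6f}")

# Sanity check: near q = 1 the value approaches the Kullback-Leibler divergence.
print("KL(P||Q) =", np.sum(P * np.log(P / Q)))
```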

Highlights

  • Over the past few decades, there have been several suggestions for generalizations of the Boltzmann–Gibbs–Shannon (BGS) entropy formula [1,2,3,4,5,6]

  • In the present paper we investigate whether such entropy formulas define an entropic distance between two probability distributions with the following useful properties: (1) it is positive for any two different distributions; (2) it is zero when comparing any distribution with itself; and (3) it is symmetric (a numerical check of these properties is sketched after this list)

  • In the following we show this behavior for the traditional logarithmic entropy formula and a generally state-dependent nearest-neighbour master equation, and propose a generalization of the symmetrized entropic distance measure based on the deformed logarithm function
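As a numerical check of the three properties listed above, one natural symmetrized distance built from the Tsallis q-deformed logarithm pairs the difference of probabilities with the difference of their deformed logarithms. This particular pairing is an assumption chosen for illustration (the paper's generalization may differ in detail); for q = 1 it reduces to the symmetrized Kullback–Leibler (Jeffreys) divergence.

```python
import numpy as np

def ln_q(x, q):
    """Tsallis q-deformed logarithm; recovers ln(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x**(1.0 - q) - 1.0) / (1.0 - q)

def sym_distance(P, Q, q):
    """Symmetrized, deformed-log based distance.

    Each term (P_i - Q_i)*(ln_q(P_i) - ln_q(Q_i)) is non-negative because
    ln_q is strictly increasing, so the sum is positive for P != Q,
    zero for P == Q, and symmetric in (P, Q) by construction.
    """
    return np.sum((P - Q) * (ln_q(P, q) - ln_q(Q, q)))

rng = np.random.default_rng(0)
P = rng.random(5); P /= P.sum()
Q = rng.random(5); Q /= Q.sum()

for q in (0.7, 1.0, 1.3):
    d_pq = sym_distance(P, Q, q)
    d_qp = sym_distance(Q, P, q)
    d_pp = sym_distance(P, P, q)
    print(f"q={q}: positive {d_pq > 0}, symmetric {np.isclose(d_pq, d_qp)}, self-distance {d_pp}")
```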



Introduction

There have been several suggestions for generalizations of the Boltzmann–Gibbs–Shannon (BGS) entropy formula [1,2,3,4,5,6]. Most formulas can be grouped into categories either by their mathematical form (trace form, or a function of the trace form) [7], or by their scaling properties for large systems, usually providing large entropies, S, even if not necessarily proportional to the logarithm of the number of states, ln W [8,9,10,11]. In the following we show this behavior, the shrinking of the entropic distance towards the stationary distribution, for the traditional logarithmic entropy formula and a generally state-dependent nearest-neighbour master equation (defined below), and we propose a generalization of the symmetrized entropic distance measure based on the deformed logarithm function. We conclude that some proposals guarantee a shrinking of the entropic distance during the approach to the stationary distribution only for a restricted range of the non-extensivity parameter(s) used in the entropy formula.
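The sketch below illustrates this statement in the simplest BGS setting: a nearest-neighbour (birth-death) master equation with state-dependent rates is evolved towards its detailed-balance stationary distribution, and the symmetrized logarithmic entropic distance to that distribution is seen to shrink monotonically. The rates, time step, and system size are arbitrary illustrative choices, not values taken from the paper.

```python
import numpy as np

# Nearest-neighbour (birth-death) master equation on states n = 0..N-1 with
# state-dependent hop rates; the rates below are purely illustrative.
N = 30
lam = 1.0 + 0.05 * np.arange(N)        # rate for n -> n+1
mu  = 0.5 + 0.10 * np.arange(N)        # rate for n -> n-1
lam[-1] = 0.0                          # reflecting boundary at the last state
mu[0]   = 0.0                          # reflecting boundary at the first state

# Stationary distribution from detailed balance: Q_{n+1} mu_{n+1} = Q_n lam_n.
Q = np.ones(N)
for n in range(N - 1):
    Q[n + 1] = Q[n] * lam[n] / mu[n + 1]
Q /= Q.sum()

def step(P, dt):
    """One explicit Euler step of dP_n/dt = gain_n - loss_n for nearest-neighbour hops."""
    gain = np.zeros(N)
    gain[1:]  += lam[:-1] * P[:-1]     # inflow from state n-1
    gain[:-1] += mu[1:]  * P[1:]       # inflow from state n+1
    loss = (lam + mu) * P
    return P + dt * (gain - loss)

def jeffreys(P, Q):
    """Symmetrized BGS entropic distance, sum_n (P_n - Q_n) ln(P_n / Q_n)."""
    return np.sum((P - Q) * np.log(P / Q))

# Start far from stationarity and watch the entropic distance shrink.
P = np.ones(N) / N
dt = 0.01
for t in range(2001):
    if t % 400 == 0:
        print(f"t = {t*dt:5.1f}  rho[P, Q] = {jeffreys(P, Q):.6f}")
    P = step(P, dt)
```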

Probability Distributions
Master Equation
Entropic Distance
Conclusions