Abstract
More and more works deal with statistical systems far from equilibrium, dominated by unidirectional stochastic processes and augmented by rare resets. We analyze the construction of an entropic distance measure appropriate for such dynamics. We demonstrate that a power-like nonlinearity in the state probability in the master equation naturally leads to the Tsallis (Havrda–Charvát, Aczél–Daróczy) q-entropy formula when seeking the maximal-entropy state at stationarity. A few possible applications of a simple, linear master equation to phenomena studied in statistical physics are listed at the end.
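As a concrete illustration (not part of the paper itself), the Tsallis q-entropy of a discrete distribution is S_q = (1 − Σ_n p_n^q)/(q − 1), which recovers the Shannon (Boltzmann–Gibbs) entropy in the q → 1 limit. A minimal sketch:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis (Havrda-Charvat, Aczel-Daroczy) q-entropy of a
    discrete, normalized distribution p.  Reduces to the Shannon
    entropy (natural log) in the q -> 1 limit."""
    if abs(q - 1.0) < 1e-12:
        # q -> 1 limit: S_1 = -sum p_n ln p_n
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

uniform = [0.25] * 4
print(tsallis_entropy(uniform, 2.0))  # (1 - 4/16) / 1 = 0.75
print(tsallis_entropy(uniform, 1.0))  # ln 4 ~ 1.386
```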
Highlights
More and more works deal with statistical systems far from equilibrium, dominated by unidirectional stochastic processes, augmented by rare resets
In this paper, we explore a reverse-engineering concept: seeking an entropic divergence formula subject to certain desired properties, we treat entropy as a derived quantity
Summarizing, in this paper we have presented a construction strategy for the entropic distance formula, designed to shrink under a wide class of stochastic dynamics
Summary
Dealing with the dynamics of classical probabilities, we would like to propose a general recipe for defining the corresponding formula for the entropic divergence between two probability distributions. This definition is not symmetric in its handling of the normalized distributions P_n and Q_n; it is, however, an easy task to consider the symmetrized version, s[P, Q] ≡ ρ[P, Q] + ρ[Q, P]. This symmetrized entropic divergence inherits some properties from the fiducial construction. The consequences, listed below, can be derived from these general relations: s(1) = 2σ(1) = 0, s′(1) = σ′(1) = 0, and s″ > 0, so ξ_m = 1 is a minimum and s(ξ) ≥ 0. In this way the kernel function, and hence each summand in the symmetrized entropic divergence formula, is non-negative, and so is the total sum.
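To make the symmetrization concrete, the following sketch uses the Kullback–Leibler divergence as one familiar example of an entropic divergence ρ[P, Q] (the paper's general construction admits other kernels σ); the distributions P and Q below are illustrative, and Q is assumed strictly positive wherever P is:

```python
import math

def kl_divergence(P, Q):
    """Kullback-Leibler divergence rho[P, Q], a familiar special case
    of an entropic divergence (kernel sigma(xi) = xi ln xi).
    Assumes Q_n > 0 wherever P_n > 0."""
    return sum(p * math.log(p / q) for p, q in zip(P, Q) if p > 0)

def symmetrized_divergence(P, Q):
    """s[P, Q] = rho[P, Q] + rho[Q, P]; for the KL case this is the
    Jeffreys divergence.  Non-negative, and zero only at P == Q."""
    return kl_divergence(P, Q) + kl_divergence(Q, P)

P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]
print(symmetrized_divergence(P, Q) > 0)  # True for P != Q
print(symmetrized_divergence(P, P))      # 0.0 at the minimum P == Q
```

The non-negativity of each summand, noted above, follows because the kernel vanishes with zero derivative at its minimum ξ_m = 1 and is convex there.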