Abstract
An advantageous method for understanding complexity is information geometry theory. In particular, a dimensionless distance, called the information length 𝓛, permits us to describe time-varying, non-equilibrium processes by measuring the total change in the information along the evolution path of a stochastic variable, or equivalently the total number of statistically different states the variable passes through in time. Here, we elucidate the meaning of the information length 𝓛 and the information rate Γ in light of thermodynamics (the entropy production rate, the non-equilibrium free energy, the microscopic chemical potential μ, etc). In particular, the average ⟨∂_t μ⟩ gives the average rate of work (power), while the second moment ⟨(∂_t(μ − V))²⟩ is proportional to Γ². Here, the angular brackets denote the average and V is the potential. The upper bound on the entropy production rate is set by the product of Γ and the RMS value of the fluctuating part δμ = μ − ⟨μ⟩. Specifically, in the case of the non-autonomous Ornstein–Uhlenbeck process for a stochastic variable x, we show that the entropy production rate is bounded above by Γ² up to the fluctuation normalization σ² = ⟨(δx)²⟩, where σ is the standard deviation and δx = x − ⟨x⟩ is the fluctuating component of x. The equality holds in the (isothermal) case where σ and the temperature D are constant. We discuss the implications of 𝓛 as a proxy for the entropy production along an evolution path and for understanding self-organization.
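As a minimal illustrative sketch (not taken from the paper itself), the quantities Γ and 𝓛 are conventionally defined in this literature from the time-dependent probability density p(x, t); the Gaussian reduction below is our own assumption for the Ornstein–Uhlenbeck case, where the density stays Gaussian with mean ⟨x⟩(t) and variance σ²(t):

\[
\Gamma^2(t) = \int \mathrm{d}x\, p(x,t)\,\bigl[\partial_t \ln p(x,t)\bigr]^2,
\qquad
\mathcal{L}(t) = \int_0^t \Gamma(t_1)\, \mathrm{d}t_1 ,
\]

\[
% Gaussian case (assumed here for the OU process):
\Gamma^2 = \frac{\bigl(\partial_t \langle x\rangle\bigr)^2}{\sigma^2}
         + \frac{2\,\bigl(\partial_t \sigma\bigr)^2}{\sigma^2} .
\]

In this Gaussian form the fluctuation normalization σ² = ⟨(δx)²⟩ appears explicitly in Γ², consistent with the bound on the entropy production rate stated in the abstract.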