Abstract

Entropic dynamics is a framework in which the laws of dynamics are derived as an application of entropic methods of inference. Its successes include the derivation of quantum mechanics and quantum field theory from probabilistic principles. Here, we develop the entropic dynamics of a system, the state of which is described by a probability distribution. Thus, the dynamics unfolds on a statistical manifold that is automatically endowed with a metric structure provided by information geometry. The curvature of the manifold has a significant influence on the resulting dynamics. We focus our dynamics on the statistical manifold of Gibbs distributions (also known as canonical distributions or the exponential family). The model includes an “entropic” notion of time that is tailored to the system under study; the system is its own clock. As one might expect, entropic time is intrinsically directional; there is a natural arrow of time that is led by entropic considerations. As illustrative examples, we discuss dynamics on a space of Gaussians and the discrete three-state system.
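The metric that information geometry supplies on a statistical manifold is the Fisher–Rao metric. As a concrete illustration of the Gaussian example mentioned in the abstract, the following minimal sketch (not taken from the paper) estimates the Fisher metric of a univariate Gaussian family in the (mu, sigma) coordinates by Monte Carlo and compares it with the closed form diag(1/sigma^2, 2/sigma^2); the parameterization, function name, and sample size are illustrative assumptions, not the paper's own setup.

```python
import numpy as np

def fisher_metric_gaussian(mu, sigma, n_samples=200_000, seed=0):
    """Monte Carlo estimate of the information (Fisher-Rao) metric
    g_ij = E[(d_i log p)(d_j log p)] for N(mu, sigma^2) in coordinates (mu, sigma)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, n_samples)
    s_mu = (x - mu) / sigma**2                     # score component: d log p / d mu
    s_sigma = ((x - mu)**2 - sigma**2) / sigma**3  # score component: d log p / d sigma
    scores = np.stack([s_mu, s_sigma], axis=1)
    return scores.T @ scores / n_samples           # sample average of the score outer product

g = fisher_metric_gaussian(mu=0.0, sigma=2.0)
print(g)                                  # approximately [[0.25, 0], [0, 0.5]]
print(np.diag([1 / 2.0**2, 2 / 2.0**2]))  # exact: diag(1/sigma^2, 2/sigma^2)
```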

Highlights

  • The original method of Maximum Entropy (MaxEnt) is usually associated with the names of Shannon [1] and Jaynes [2,3,4,5], although its roots can be traced back to Gibbs [6]

  • The article is organized as follows: Section 2 discusses the space of Gibbs distributions and its geometric properties; Section 3 considers the ideas of Entropic Dynamics (ED); Section 4 tackles the difficulties associated with formulating ED on the curved space of probability distributions; Section 5 introduces the notion of entropic time; Section 6 describes the evolution of the system in the form of a differential equation; and in Section 7, we offer two illustrative examples of ED on a Gaussian manifold and on a two-simplex

  • λ_j = ∂S/∂A^j, which shows that the coordinates A and λ are related through a Legendre transformation, which is a consequence of entropy maximization, and through a vector–covector duality, i.e., −dλ_i is the covector dual to dA^i, which is a consequence of information geometry (written out explicitly after this list)

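The duality stated in the last highlight can be written out in full. The following LaTeX fragment is a reconstruction based on the standard Gibbs/MaxEnt relations; the index placement and sign conventions are assumptions, not quoted from the paper.

```latex
% Gibbs distribution from maximizing entropy subject to <a^i> = A^i:
%   rho(x|lambda) = exp(-lambda_i a^i(x)) / Z(lambda)
\[
  A^i = \langle a^i \rangle = -\frac{\partial \log Z}{\partial \lambda_i},
  \qquad
  S(A) = \log Z(\lambda) + \lambda_i A^i,
  \qquad
  \lambda_j = \frac{\partial S}{\partial A^j}.
\]
% Legendre duality: dS = lambda_i dA^i, and the information metric
% g_ij = -d^2 S / dA^i dA^j maps the vector dA^i to the covector -d(lambda_i):
\[
  g_{ij}\, dA^j
  = -\,\frac{\partial^2 S}{\partial A^i \partial A^j}\, dA^j
  = -\,\frac{\partial \lambda_i}{\partial A^j}\, dA^j
  = -\,d\lambda_i .
\]
```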

Summary

Introduction

The original method of Maximum Entropy (MaxEnt) is usually associated with the names of Shannon [1] and Jaynes [2,3,4,5], although its roots can be traced back to Gibbs [6]. The dynamics involves two tasks: the first is to find the transition probability for a single short step, which is obtained by maximizing an entropy subject to appropriate constraints; the second requires a scheme for keeping track of how a large number of these short steps accumulate to produce a finite motion. It is the latter task that involves the introduction of the concept of time. The article is organized as follows: Section 2 discusses the space of Gibbs distributions and its geometric properties; Section 3 considers the ideas of ED; Section 4 tackles the difficulties associated with formulating ED on the curved space of probability distributions; Section 5 introduces the notion of entropic time; Section 6 describes the evolution of the system in the form of a differential equation; and in Section 7, we offer two illustrative examples of ED on a Gaussian manifold and on a two-simplex.
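To make the MaxEnt step concrete, here is a minimal numerical sketch (an illustration, not code from the paper): given an expectation-value constraint ⟨a⟩ = A, the maximum-entropy distribution relative to a uniform prior is a Gibbs distribution whose Lagrange multiplier λ is tuned until the constraint is satisfied. The three-state example mirrors the two-simplex discussed in Section 7; the function name and the choice of a_k are hypothetical.

```python
import numpy as np

def gibbs_from_constraint(a, A_target, tol=1e-12, max_iter=100):
    """Find the Gibbs (canonical) distribution p_k ∝ exp(-lam * a_k) on a discrete
    space satisfying the expectation constraint <a> = A_target.
    This is the MaxEnt solution with a uniform prior; lam is the Lagrange multiplier.
    A_target must lie strictly between min(a) and max(a)."""
    a = np.asarray(a, dtype=float)
    lam = 0.0
    for _ in range(max_iter):
        w = np.exp(-lam * a)
        p = w / w.sum()
        A = p @ a                      # current expected value <a>
        var = p @ (a - A)**2           # since dA/dlam = -Var(a)
        if abs(A - A_target) < tol:
            break
        lam += (A - A_target) / var    # Newton step toward A(lam) = A_target
    return p, lam

# Three-state system (a point on the two-simplex), with a_k = (0, 1, 2):
p, lam = gibbs_from_constraint(a=[0.0, 1.0, 2.0], A_target=0.7)
print(p, p @ np.array([0.0, 1.0, 2.0]), lam)
```

Solving for λ at a prescribed A is precisely the Legendre-transform relationship between the coordinates A and λ noted in the Highlights.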

Gibbs Distributions
Information Geometry
Change Happens
The Prior
The Constraints
Maximizing the Entropy
The Transition Probability
Introducing Time
The Entropic Arrow of Time
Calibrating the Clock
Diffusion and the Fokker–Planck Equation
Examples
A Gaussian Manifold
Conclusions