Abstract

Random fields are among the mathematical models used in the study of non-deterministic and non-linear complex systems. A fundamental aspect in the characterization of such models is to define intrinsic properties and to understand how these properties change over time. In this paper, we propose an information-geometric framework to analyze Gaussian–Markov random fields (GMRFs) by defining a Fisher-information-based space. With this approach, it is possible to measure the variations in each component of the metric tensor that equips the underlying parametric space as the system visits different entropic states. Using Markov Chain Monte Carlo simulations, we propose a method based on infinitesimal displacements to compute distances, under the Fisher metric, between two systems operating in different regimes. Moreover, we derive an expression for the KL-divergence (relative entropy) between two GMRF models, showing that its symmetrized version works reasonably well as a replacement for the Fisher-information-based distance. Finally, information cycles that relate components of the metric tensor and the system's entropy reveal an asymmetric pattern of evolution when the system moves towards different entropic states, indicating that the simple interaction between several Gaussian random variables can lead to the emergence of an intrinsic notion of time in the evolution of a random field, based on the geometric properties of its parametric space.
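The abstract does not reproduce the paper's GMRF-specific derivation, but the symmetrized KL divergence it mentions has a well-known closed form for multivariate Gaussian densities. The sketch below illustrates only that generic case (mean vectors `mu0`, `mu1` and covariance matrices `cov0`, `cov1` are illustrative inputs, not parameters from the paper): compute KL(p||q) in closed form and average it with KL(q||p) to obtain the symmetric quantity used as a stand-in for the Fisher-metric distance.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N(mu0,cov0) || N(mu1,cov1))."""
    k = mu0.shape[0]                      # dimension of the Gaussian
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    trace_term = np.trace(cov1_inv @ cov0)
    quad_term = diff @ cov1_inv @ diff    # Mahalanobis-type mean shift
    _, logdet0 = np.linalg.slogdet(cov0)  # log-determinants, numerically stable
    _, logdet1 = np.linalg.slogdet(cov1)
    return 0.5 * (trace_term + quad_term - k + logdet1 - logdet0)

def sym_kl(mu0, cov0, mu1, cov1):
    """Symmetrized KL (Jeffreys divergence) between two Gaussian models."""
    return 0.5 * (kl_gaussian(mu0, cov0, mu1, cov1)
                  + kl_gaussian(mu1, cov1, mu0, cov0))

# Two toy 2-D Gaussians in different "regimes" (illustrative values only)
mu_a, cov_a = np.zeros(2), np.eye(2)
mu_b, cov_b = np.array([1.0, 0.0]), 2.0 * np.eye(2)
d = sym_kl(mu_a, cov_a, mu_b, cov_b)
```

Unlike the plain KL divergence, `sym_kl` is symmetric in its arguments and vanishes only when the two models coincide, which is what makes it a plausible substitute for a distance.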
