Abstract

Complexity is often envisaged as the impossibility of reconstructing the whole of a system from knowledge of its parts. When a probabilistic description is in order, a mathematically rigorous way to formalize this intuition is to rely on the principle of maximum entropy as a tool to infer probability distributions from structural or observational constraints. This thesis aims to evaluate this heuristic criterion in three different contexts. First, we consider the case where the transition matrix generating a discrete, finite Markov process must be reconstructed from observed autocorrelations, with an emphasis on short historical samples. Second, we examine how maximum entropy methods and information theory can be linked to complexity as it is usually expressed in the particular context of cellular automata. The last part reconsiders key assumptions underlying the kinetic theory of gases from the perspective of information theory, aiming in particular to generalize Boltzmann's molecular chaos hypothesis.
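As a minimal illustration of the maximum entropy principle invoked above (not code from the thesis itself), consider Jaynes's classic example: inferring the least-biased distribution over the faces of a die given only its observed mean. The solution has Gibbs form p_i ∝ exp(λ·x_i), and the Lagrange multiplier λ can be found by simple bisection on the mean constraint. The function name and the target mean of 4.5 are illustrative choices.

```python
import numpy as np

def maxent_die(target_mean, faces=np.arange(1, 7), tol=1e-10):
    """Maximum-entropy distribution over die faces with a fixed mean.

    The maximizer of Shannon entropy under a mean constraint has the
    Gibbs form p_i proportional to exp(lam * x_i); we bisect on the
    Lagrange multiplier lam until <x> matches target_mean.
    """
    def mean_and_dist(lam):
        w = np.exp(lam * faces)
        p = w / w.sum()
        return p @ faces, p

    lo, hi = -10.0, 10.0  # bracket for the multiplier lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        m, _ = mean_and_dist(mid)
        if m < target_mean:  # mean is monotone increasing in lam
            lo = mid
        else:
            hi = mid
    return mean_and_dist(0.5 * (lo + hi))[1]

p = maxent_die(4.5)
print(p)                     # probabilities tilted toward high faces
print(p @ np.arange(1, 7))   # constrained mean, close to 4.5
```

A mean of 3.5 recovers the uniform distribution, as it should: with no information beyond the fair-die mean, maximum entropy adds no further bias. The same logic, with autocorrelations as constraints instead of a mean, underlies the Markov-matrix reconstruction problem the abstract describes.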
