Abstract

We introduce a general IFS Bayesian method for obtaining posterior probabilities from prior probabilities, and also a generalized Bayes' rule, which covers both a dynamical and a non-dynamical setting. Given a loss function, we describe the prior and posterior elements, derive their consequences, and exhibit several examples. Taking $\Theta$ as the set of parameters and $Y$ as the set of data (which usually provides random samples), a general IFS is a measurable map $\Theta \times Y \to Y$, which can be interpreted as a family of maps $Y \to Y$ indexed by the parameter $\theta \in \Theta$. The main inspiration for our results comes from a paper by Zellner (in a non-dynamical setting), where Bayes' rule is related to a principle of minimization of information. We show that our IFS Bayesian method, which produces posterior probabilities (associated with holonomic probabilities), is related to the optimal solution of a variational principle, corresponding in some sense to the pressure in Thermodynamic Formalism, and also to the principle of minimization of information in Information Theory. Among other results, we present the prior dynamical elements and derive the corresponding posterior elements via the Ruelle operator of Thermodynamic Formalism, obtaining in this way a form of dynamical Bayes' rule.
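For orientation, the following is a sketch of the classical (non-dynamical) situation that the generalized rule extends; the notation $\pi$, $f$, $m$ and $q$ is introduced here only to illustrate Zellner's minimization principle and is not taken from the paper's general IFS construction. Given a prior density $\pi(\theta)$ on $\Theta$ and a likelihood $f(y\mid\theta)$ on $Y$, Bayes' rule produces the posterior

\[
\pi(\theta\mid y) \;=\; \frac{f(y\mid\theta)\,\pi(\theta)}{m(y)},
\qquad
m(y) \;=\; \int_\Theta f(y\mid\theta)\,\pi(\theta)\,d\theta .
\]

Zellner's observation is that this posterior is the unique minimizer, over probability densities $q$ on $\Theta$, of the information functional

\[
\Delta(q) \;=\; \int_\Theta q(\theta)\log q(\theta)\,d\theta \;+\; \log m(y)
\;-\; \int_\Theta q(\theta)\,\log\!\big(f(y\mid\theta)\,\pi(\theta)\big)\,d\theta
\;=\; D_{\mathrm{KL}}\!\big(q \,\big\|\, \pi(\cdot\mid y)\big) \;\ge\; 0,
\]

whose minimum value zero is attained exactly at $q = \pi(\cdot\mid y)$. The variational principle discussed in the paper plays the analogous role in the dynamical, IFS setting.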
