Abstract

The central limit theorem states that, in the limit of a large number of terms, an appropriately scaled sum of independent random variables yields another random variable whose probability distribution tends to a stable distribution. The condition of independence, however, only holds as an approximation in real systems. To extend the theorem to more general situations, previous studies have derived versions of the central limit theorem that also hold for variables that are not independent. Here, we present numerical results that characterize how convergence is attained when the variables being summed are deterministically related to one another through the recurrent application of an ergodic mapping. In all the explored cases, convergence to the limit distribution is slower than for random sampling. Yet the speed at which convergence is attained varies substantially from system to system, and these variations imply differences in the way information about the deterministic nature of the dynamics is progressively lost as the number of summands increases. Among the factors identified as shaping the convergence process are the strength of mixing induced by the mapping and the shape of the marginal distribution of each variable, in particular the presence of divergences or fat tails.
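
The following is a minimal sketch, not the authors' code, of the kind of numerical experiment the abstract describes: comparing how the scaled sum of iterates of an ergodic map approaches the Gaussian limit against sums of independent samples drawn from the same marginal distribution. The choice of the fully chaotic logistic map, the beta(1/2, 1/2) marginal, and the Kolmogorov–Smirnov distance as a convergence measure are assumptions made for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def logistic_orbit(n_terms, n_realizations):
    """Orbits of length n_terms of the ergodic map x -> 4 x (1 - x)."""
    x = rng.uniform(0.01, 0.99, size=n_realizations)
    orbit = np.empty((n_terms, n_realizations))
    for t in range(n_terms):
        orbit[t] = x
        x = 4.0 * x * (1.0 - x)
    return orbit

def iid_samples(n_terms, n_realizations):
    """Independent samples with the same marginal (arcsine) distribution."""
    return rng.beta(0.5, 0.5, size=(n_terms, n_realizations))

def ks_to_gaussian(sampler, n_terms, n_realizations=20000):
    """KS distance between the centered, scaled sum and the standard normal."""
    data = sampler(n_terms, n_realizations)
    s = data.sum(axis=0)
    z = (s - s.mean()) / s.std()  # empirical centering and scaling
    return stats.kstest(z, 'norm').statistic

for n in (4, 16, 64, 256):
    d_map = ks_to_gaussian(logistic_orbit, n)
    d_iid = ks_to_gaussian(iid_samples, n)
    print(f"n = {n:4d}  KS deterministic = {d_map:.4f}  KS i.i.d. = {d_iid:.4f}")
```

Under these assumptions, the deterministic sums should show a larger Kolmogorov–Smirnov distance at a given number of summands than the independent sums, illustrating the slower convergence reported in the abstract; the specific maps and diagnostics used in the study may differ.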
