Abstract
In this paper, we introduce the Kullback–Leibler information function ρ(ν, μ) and prove the local large deviation principle for σ-finite measures μ and finitely additive probability measures ν. In particular, the entropy of a continuous probability distribution ν on the real axis is interpreted as the exponential rate of asymptotics for the Lebesgue measure of the set of those samples that generate empirical measures close to ν in a suitable fine topology.
Highlights
Let P be a continuous probability distribution on the real axis with density φ(x) = dP(x)/dx. Its entropy is defined as

H(P) = −∫_ℝ φ(x) ln φ(x) dx.   (1)

What is the substantive sense of H(P)? More precisely, does there exist a mathematical object whose natural quantitative magnitude is a certain function of the entropy? Traditionally, entropy is treated as a measure of disorder.
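As a numerical sanity check on definition (1), the sketch below approximates the differential entropy of the standard normal density by a midpoint Riemann sum and compares it with the known closed form ½ ln(2πe). The function name `entropy_riemann` and the choice of density are our own illustrative assumptions, not part of the paper.

```python
import math

def entropy_riemann(phi, lo, hi, n=100_000):
    """Approximate H(P) = -∫ φ(x) ln φ(x) dx by a midpoint Riemann sum."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        p = phi(x)
        if p > 0:  # 0 · ln 0 is taken as 0 by convention
            total -= p * math.log(p) * dx
    return total

# Standard normal density as a test case (our choice of example).
phi = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

h = entropy_riemann(phi, -10.0, 10.0)
exact = 0.5 * math.log(2 * math.pi * math.e)  # known value for N(0, 1)
print(h, exact)
```

The tails beyond ±10 contribute negligibly, so the sum agrees with the closed form to high accuracy.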
In the case of a metric space X equipped with a Borel σ-field, the neighborhood O(ν) in (6) can be chosen from the weak topology generated by bounded continuous functions.
Summary
It makes sense to consider finitely additive probability distributions P as well, since some sequences of empirical measures may converge to finitely additive distributions. In such a case, the Kullback action can take only the values +∞ or −∞ (Theorem 6). The Kullback action ρ(P, Q) coincides (up to sign) with the entropy (2). It was revealed in [21,22] that, for the “counting” measure Q on the countable space X, the ordinary form of the large deviation principle, formulated in terms of the weak topology, fails, and so one should use the fine topology instead.
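For a discrete space, the Kullback action reduces to the familiar Kullback–Leibler sum, which is +∞ as soon as P charges a point that Q does not. A minimal sketch, assuming finite supports and our own function name `kullback`:

```python
import math
from collections import Counter

def kullback(P, Q):
    """KL divergence sum_x P(x) ln(P(x)/Q(x)) for dicts of probabilities.

    Returns math.inf when P charges a point outside the support of Q,
    matching the convention that the action may be infinite.
    """
    total = 0.0
    for x, p in P.items():
        if p == 0:
            continue
        q = Q.get(x, 0.0)
        if q == 0.0:
            return math.inf
        total += p * math.log(p / q)
    return total

# Empirical measure of a small sample (illustrative data, our choice).
sample = ["a", "a", "b", "c"]
P = {x: c / len(sample) for x, c in Counter(sample).items()}
Q = {"a": 0.5, "b": 0.25, "c": 0.25}
print(kullback(P, Q))                # P equals Q here, so the action is 0
print(kullback({"d": 1.0}, Q))       # P charges a point Q misses: infinity
```

This also illustrates the dichotomy mentioned above: once the reference measure fails to dominate, no finite value is possible.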