Abstract

Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques, it is shown that when $S_n = \sum_{i=1}^n X_i$ is the sum of the (possibly dependent) binary random variables $X_1, X_2, \ldots, X_n$, with $E(X_i) = p_i$ and $E(S_n) = \lambda$, then

$$D(P_{S_n} \,\|\, \mathrm{Po}(\lambda)) \;\le\; \sum_{i=1}^n p_i^2 \;+\; \Big[ \sum_{i=1}^n H(X_i) - H(X_1, X_2, \ldots, X_n) \Big],$$

where $D(P_{S_n} \,\|\, \mathrm{Po}(\lambda))$ is the relative entropy between the distribution of $S_n$ and the Poisson($\lambda$) distribution. The first term in this bound measures the individual smallness of the $X_i$, and the second term measures their dependence. A general method is outlined for obtaining corresponding bounds when approximating the distribution of a sum of general discrete random variables by an infinitely divisible distribution. Second, in the particular case when the $X_i$ are independent, the following sharper bound is established:

$$D(P_{S_n} \,\|\, \mathrm{Po}(\lambda)) \;\le\; \frac{1}{\lambda} \sum_{i=1}^n \frac{p_i^3}{1 - p_i},$$

and it is also generalized to the case when the $X_i$ are general integer-valued random variables. Its proof is based on the derivation of a subadditivity property for a new discrete version of the Fisher information, and uses a recent logarithmic Sobolev inequality for the Poisson distribution.
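Both bounds are easy to check numerically. The following Python sketch (an illustration, not part of the paper) computes the exact distribution of a sum of independent Bernoulli variables by convolution, evaluates $D(P_{S_n} \,\|\, \mathrm{Po}(\lambda))$ directly, and compares it against the two bounds; under independence the entropy-gap term of the first bound vanishes, so it reduces to $\sum_i p_i^2$. The helper names (`bernoulli_sum_pmf`, `kl_vs_poisson`) and the example probabilities are assumptions chosen for the demonstration.

```python
import math

def bernoulli_sum_pmf(ps):
    """Exact pmf of S_n = X_1 + ... + X_n for independent Bernoulli(p_i),
    computed by repeated convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)   # contribution from X_i = 0
            new[k + 1] += q * p     # contribution from X_i = 1
        pmf = new
    return pmf

def kl_vs_poisson(pmf, lam):
    """Relative entropy D(P || Po(lam)) in nats, summed over the
    (finite) support of P; zero-probability terms contribute nothing."""
    d = 0.0
    for k, q in enumerate(pmf):
        if q > 0:
            po = math.exp(-lam) * lam**k / math.factorial(k)
            d += q * math.log(q / po)
    return d

# Hypothetical example parameters, not from the paper.
ps = [0.1, 0.05, 0.2, 0.15]
lam = sum(ps)
pmf = bernoulli_sum_pmf(ps)

d = kl_vs_poisson(pmf, lam)
bound1 = sum(p**2 for p in ps)                  # first bound; the entropy-gap
                                                # term is 0 for independent X_i
bound2 = sum(p**3 / (1 - p) for p in ps) / lam  # sharper bound (independent case)

print(f"D(P||Po) = {d:.6f}, first bound = {bound1:.6f}, sharper bound = {bound2:.6f}")
```

On this example the sharper bound is roughly half the size of the first one, consistent with the abstract's claim that it improves on the elementary bound in the independent case.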
