The analytic information theory of discrete distributions was initiated in 1998 by C. Knessl, P. Jacquet and W. Szpankowski, who addressed the precise evaluation of the Rényi and Shannon entropies of the Poisson, Pascal (or negative binomial) and binomial distributions. They derived various asymptotic approximations and, in some cases, lower and upper bounds for these quantities. Here, we extend these investigations in a twofold way. First, we consider a much larger class of distributions, the Rakhmanov distributions ρ_n(x) = p_n²(x) ω(x), where {p_n(x)} denote the sequences of discrete hypergeometric-type polynomials orthogonal with respect to a weight function ω(x) of Poisson, Pascal, binomial or hypergeometric type, i.e. the polynomials of Charlier, Meixner, Kravchuk and Hahn. Second, we obtain explicit expressions for the relative Fisher information of these four families of Rakhmanov distributions with respect to their respective weight functions.
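For orientation, a minimal sketch of the two objects the abstract refers to, written in standard notation: the Rakhmanov density attached to a discrete orthonormal system, and a forward-difference analogue of the relative Fisher information. The first display follows the usual convention for Rakhmanov densities; the second is an illustrative assumption, since the exact discrete functional adopted in the paper is not reproduced in this abstract.

% Rakhmanov density built from the polynomials p_n, orthonormal on a
% lattice (e.g. x = 0, 1, 2, ...) with respect to the discrete weight \omega:
\[
  \rho_n(x) \;=\; p_n^{2}(x)\,\omega(x),
  \qquad
  \sum_{x} p_m(x)\,p_n(x)\,\omega(x) \;=\; \delta_{mn}.
\]
% One commonly used discrete analogue of the relative Fisher information of
% \rho_n with respect to \omega, with forward differences in place of the
% derivative of the continuous case; the paper's functional may differ:
\[
  I\!\left(\rho_n \,\middle|\, \omega\right)
  \;=\;
  \sum_{x} \rho_n(x)
  \left[
    \frac{\rho_n(x+1)\,\omega(x)}{\rho_n(x)\,\omega(x+1)} \;-\; 1
  \right]^{2}.
\]

Since ρ_n(x)/ω(x) = p_n²(x), a functional of this type reduces to sums over the polynomials themselves, which is what makes closed-form evaluation feasible for the Charlier, Meixner, Kravchuk and Hahn families.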