Abstract

We investigate a basic question of information theory, namely, the evaluation of the Shannon entropy, and of the more general Rényi (1961) entropy, for some discrete distributions (e.g., binomial, negative binomial, etc.). We aim at establishing analytic methods (i.e., those in which complex analysis plays a pivotal role) for such computations, which often yield estimates of unparalleled precision. The main analytic tool used here is analytic poissonization and depoissonization. We illustrate our approach on the entropy evaluation of the binomial distribution; that is, we prove that for the binomial$(n, p)$ distribution the Shannon entropy $h_n$ satisfies $h_n \approx \tfrac{1}{2}\ln n + \tfrac{1}{2} + \ln\sqrt{2\pi p(1-p)} + \sum_{k \ge 1} a_k n^{-k}$, where the $a_k$ are explicitly computable constants. Moreover, we argue that analytic methods (e.g., complex asymptotics such as Rice's method and singularity analysis, Mellin transforms, poissonization, and depoissonization) can offer new tools for information theory, especially for studying second-order asymptotics (e.g., redundancy). In fact, there has been a resurgence of interest and a few successful applications of analytic methods to a variety of problems of information theory; we therefore propose to name such investigations analytic information theory.
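As a quick numerical sanity check of the expansion stated above, the sketch below (not from the paper; the function names and the choice of $p = 0.3$ are illustrative) compares the exact Shannon entropy of the binomial$(n, p)$ distribution, computed by direct summation in nats, against the leading terms $\tfrac{1}{2}\ln n + \tfrac{1}{2} + \ln\sqrt{2\pi p(1-p)}$; the correction sum $\sum_{k\ge 1} a_k n^{-k}$ is dropped, so the gap should shrink roughly like $1/n$:

```python
import math

def binomial_entropy(n: int, p: float) -> float:
    """Exact Shannon entropy (in nats) of Binomial(n, p) by direct summation."""
    h = 0.0
    for k in range(n + 1):
        # log of P(X = k), computed via lgamma to avoid overflow for large n
        log_pk = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                  + k * math.log(p) + (n - k) * math.log(1 - p))
        h -= math.exp(log_pk) * log_pk
    return h

def asymptotic_entropy(n: int, p: float) -> float:
    """Leading terms of the expansion; the a_k n^{-k} corrections are omitted."""
    return 0.5 * math.log(n) + 0.5 + math.log(math.sqrt(2 * math.pi * p * (1 - p)))

for n in (10, 100, 1000):
    exact = binomial_entropy(n, 0.3)
    approx = asymptotic_entropy(n, 0.3)
    print(f"n={n:5d}  exact={exact:.6f}  asymptotic={approx:.6f}  gap={exact - approx:.2e}")
```

Already at moderate $n$ the two agree to several digits, which is the "unparalleled precision" the analytic approach delivers once the $a_k$ corrections are included.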
