We consider the problem of computing information-theoretic functions, such as entropy, on a data stream, using sublinear space. Our first result deals with a measure we call the _entropy norm_ of an input stream: it is closely related to entropy but is structurally similar to the well-studied notion of frequency moments. We give a polylogarithmic-space, one-pass algorithm for estimating this norm under certain conditions on the input stream. We also prove a lower bound that rules out such an algorithm if these conditions do not hold. Our second group of results is for estimating the empirical entropy of an input stream. We first present a sublinear-space, one-pass algorithm for this problem. For a stream of _m_ items and a given real parameter α, our algorithm uses space _Õ_(_m_<sup>2α</sup>) and provides a (1/α)-approximation in the worst case and a (1+ε)-approximation in "most" cases. We then present a two-pass, polylogarithmic-space, (1+ε)-approximation algorithm. All our algorithms are quite simple.
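For context, here is a minimal offline (linear-space) sketch of the two quantities the abstract refers to, assuming the standard definitions: empirical entropy H = Σ<sub>i</sub> (m<sub>i</sub>/m) log(m/m<sub>i</sub>) and entropy norm F<sub>H</sub> = Σ<sub>i</sub> m<sub>i</sub> log m<sub>i</sub>, where m<sub>i</sub> is the frequency of item i in a stream of m items. The paper's exact normalization and logarithm base may differ; the streaming algorithms summarized above approximate these values in sublinear space.

```python
import math
from collections import Counter

def empirical_entropy(stream):
    """Exact empirical entropy H = sum_i (m_i/m) * log2(m/m_i),
    computed offline with full (linear) space; the one- and two-pass
    streaming algorithms described above approximate this quantity."""
    counts = Counter(stream)
    m = len(stream)
    return sum((c / m) * math.log2(m / c) for c in counts.values())

def entropy_norm(stream):
    """Entropy norm F_H = sum_i m_i * log2(m_i) (assumed definition),
    structurally similar to the frequency moments F_k = sum_i m_i^k."""
    counts = Counter(stream)
    return sum(c * math.log2(c) for c in counts.values())

# Example usage on a small stream: counts are a:3, b:2, c:1, m = 6.
stream = ["a", "b", "a", "c", "a", "b"]
print(empirical_entropy(stream))  # ~1.459 bits
print(entropy_norm(stream))       # 3*log2(3) + 2*log2(2) + 1*log2(1) ~ 6.755
```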