Abstract

In this paper we propose a quantification of distributions on a set of strings, in terms of how close to pseudorandom a distribution is. The quantification is an adaptation of the theory of dimension for sets of infinite sequences introduced by Lutz. Adapting Hitchcock's work, we also show that the logarithmic loss incurred by a predictor on a distribution is quantitatively equivalent to the notion of dimension we define. Roughly, this captures the equivalence between pseudorandomness defined via indistinguishability and via unpredictability. We then establish some natural properties of our notion of dimension and give a comparative study between our proposed notion of dimension and two well-known computational analogues of entropy, namely HILL-type pseudo min-entropy and next-bit pseudo Shannon entropy. Further, we apply our quantification to the following problem: if the dimension of a distribution on the set of n-length strings is s in (0,1], can we extract O(sn) pseudorandom bits from the distribution? We show that constructing such an extractor requires at least Omega(log n) bits of pure randomness; however, whether the same can be achieved using only O(log n) random bits remains open. We show that deterministic extraction is possible in a special case analogous to the bit-fixing sources introduced by Chor et al., which we term a nonpseudorandom bit-fixing source, and we adapt the techniques of Gabizon, Raz and Shaltiel to construct a deterministic pseudorandom extractor for this source. Finally, we make a small step towards the P vs. BPP problem by showing that the existence of an optimal stretching function, which stretches O(log n) input bits to produce n output bits whose distribution has dimension s in (0,1], implies P = BPP.
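
For context, the following is a minimal sketch of the logarithmic-loss quantity from Hitchcock's framework that the abstract refers to, written in standard (assumed) notation; the paper's precise adaptation to distributions on n-length strings is given in the full text. Here pi denotes a predictor that assigns a probability pi(w)(b) to the next bit b after seeing the prefix w, and S ranges over infinite binary sequences.

% Hedged sketch (assumed notation, not the paper's exact definitions):
% cumulative log-loss of predictor pi on a finite string w, and its
% asymptotic rate on an infinite sequence S.
\[
  \mathcal{L}^{\pi}(w) \;=\; \sum_{i=0}^{|w|-1} \log \frac{1}{\pi\bigl(w[0..i-1]\bigr)\bigl(w[i]\bigr)},
  \qquad
  \mathcal{L}^{\pi}(S) \;=\; \liminf_{n \to \infty} \frac{\mathcal{L}^{\pi}(S[0..n-1])}{n}.
\]

In Hitchcock's setting, the (resource-bounded) dimension of a set of infinite sequences equals the infimum, over predictors pi in the corresponding class, of the worst-case log-loss rate over the set; the abstract's equivalence claim is the analogous quantitative statement for the dimension of distributions defined in this paper.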
