Abstract

What is the value of just a few bits to a guesser? We study this problem in a setup where Alice wishes to guess an independent and identically distributed (i.i.d.) random vector and can procure a fixed number k of information bits from Bob, who has observed this vector through a memoryless channel. We are interested in the guessing ratio, which we define as the ratio of Alice’s guessing-moments with and without observing Bob’s bits. For the case of a uniform binary vector observed through a binary symmetric channel, we provide two upper bounds on the guessing ratio by analyzing the performance of the dictator function (for general k) and the majority function (for k = 1). We further provide a lower bound via maximum entropy (for general k) and a lower bound based on Fourier-analytic/hypercontractivity arguments (for k = 1). We then extend our maximum-entropy argument to give a lower bound on the guessing ratio for a general channel with a binary uniform input, expressed in terms of the strong data-processing inequality constant of the reverse channel. We compute this bound for the binary erasure channel and conjecture that greedy dictator functions achieve the optimal guessing ratio.
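To make the guessing-ratio definition concrete, the following is a small numerical sketch (our own illustration, not code from the paper): a uniform binary vector X^n is observed by Bob through a BSC, Bob sends k = 1 bit via a dictator function f(Y^n) = Y_1, and we compare Alice's rho-th guessing moment with and without that bit. All names and parameter values below are illustrative assumptions.

```python
# Minimal sketch of the guessing-ratio idea for a uniform binary X^n observed
# through a BSC(delta), with Bob sending k = 1 bit via the dictator f(Y^n) = Y_1.
# Names and parameter values are illustrative assumptions, not the authors' code.
import itertools

n, delta, rho = 4, 0.1, 1.0   # vector length, BSC crossover probability, moment order

def guessing_moment(probs, rho):
    """rho-th moment of the guessing time when guesses follow decreasing probability."""
    ordered = sorted(probs, reverse=True)
    return sum((i + 1) ** rho * p for i, p in enumerate(ordered))

# Uniform prior over all binary vectors of length n.
xs = list(itertools.product([0, 1], repeat=n))
prior = {x: 2.0 ** (-n) for x in xs}

# P(B = b | X = x) for the dictator bit B = Y_1, where Y_1 = X_1 w.p. 1 - delta.
def p_bit_given_x(b, x):
    return (1 - delta) if b == x[0] else delta

# Guessing moment without any help from Bob.
g_no_help = guessing_moment(list(prior.values()), rho)

# Guessing moment with the single helper bit: average over b of the moment of P(X | B = b).
g_with_help = 0.0
for b in (0, 1):
    joint = {x: prior[x] * p_bit_given_x(b, x) for x in xs}
    pb = sum(joint.values())
    posterior = [v / pb for v in joint.values()]
    g_with_help += pb * guessing_moment(posterior, rho)

print("guessing ratio (with / without):", g_with_help / g_no_help)
```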

Highlights

  • In the classical guessing problem, Alice wishes to learn the value of a discrete random variable (r.v.) X as quickly as possible by sequentially asking yes/no questions of the form “Is X = x?”, until she makes a correct guess.

  • It is well known and simple to verify that the guessing strategy which simultaneously minimizes all the positive moments of the guessing time is to guess values in decreasing order of probability.

  • We discuss the case of the binary erasure channel (BEC), for which we provide an upper bound by analyzing the greedy dictator function, in which Bob sends the first bit of his observation that has not been erased (see the sketch after this list).

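The following is a small simulation sketch of the greedy dictator rule over a BEC (our own illustration under assumed parameter values, not the paper's code): Bob forwards the first observed bit that was not erased, with an arbitrary fallback if everything was erased.

```python
# Illustrative sketch of the greedy dictator rule over a BEC(eps): Bob sends the
# first non-erased bit of his observation. Names and values are assumptions.
import random

def bec(x, eps, rng):
    """Pass the binary vector x through a BEC with erasure probability eps ('?' marks erasures)."""
    return ['?' if rng.random() < eps else b for b in x]

def greedy_dictator(y):
    """Return the first non-erased bit of Bob's observation y (0 if all bits were erased)."""
    for b in y:
        if b != '?':
            return b
    return 0

rng = random.Random(0)
x = [rng.randint(0, 1) for _ in range(8)]   # uniform binary source vector
y = bec(x, eps=0.3, rng=rng)                # Bob's noisy observation
print("X:", x, " Y:", y, " bit sent:", greedy_dictator(y))
```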

Summary

Introduction

This setup facilitates the use of large-deviation-based information-theoretic techniques, which allowed the authors to characterize the optimal reduction in the guessing-moments as a function of R to the first order in the exponent. This type of argument cannot be applied in our setup of a fixed, finite number of bits. Allowing Bob to describe Y^n using nR bits renders the problem amenable to an exact information-theoretic characterization [27]. In another related work [28], we asked which Boolean function of Y^n maximizes the reduction in the sequential mean-squared prediction error of X^n, and showed that the majority function is optimal in the noiseless case. The model in this paper is different, since the noise is applied to the inputs of the function rather than to its output.

Problem Statement
Main Results
Guessing Ratio for a General Binary Input Channel
Binary Erasure Channel