Abstract

Several questions related to the complexity of communication over noisy channels are addressed. We compare some of our results to well-known results in information theory. In particular, we compare the following two problems. Assuming that the communication channel between two processors P1 and P2 makes an error with probability ε > 0, the identification problem is to determine whether P1 and P2 have the same n-bit integer. The decoding problem is for P2 to determine the n-bit integer held by P1. For the latter problem we show that, given any arbitrarily large constant λ > 0, there exists an ε, 0 < ε < 1/2, for which no scheme requiring fewer than λn bits of communication can guarantee (for large n) any bound q < 1 on the error probability. On the other hand, given any arbitrarily small constant γ > 0 and any ε, 0 < ε < 1/2, the identification problem can be solved with (1+γ)n bits of (one-way) communication with an error probability bounded by c·2^(-αn), where c and α are positive constants. These techniques are extended to other problems, and a Boolean function with a one-bit output is shown to exhibit behavior similar to that of the decoding problem regardless of how the input bits are partitioned between the two processors.
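To make the channel model and the identification result concrete, the sketch below simulates a binary symmetric channel with crossover probability ε together with a simple one-way equality test: P1 encodes its n-bit integer with a random linear code of length (1+γ)n derived from shared randomness, sends the codeword through the channel, and P2 thresholds the Hamming distance between the received word and the encoding of its own integer. This is a minimal illustrative sketch, assuming shared randomness; it is not the construction from the paper, and the function names (bsc, encode, identify) and the particular threshold are choices made here for illustration.

```python
import random

def bsc(bits, eps, rng):
    # Binary symmetric channel: flip each bit independently with probability eps.
    return [b ^ (rng.random() < eps) for b in bits]

def encode(x_bits, G):
    # Binary linear code: each codeword bit is the inner product (mod 2)
    # of the message with one row of the generator matrix G.
    return [sum(xb & g for xb, g in zip(x_bits, row)) % 2 for row in G]

def identify(x_bits, y_bits, eps, gamma, shared_seed=0):
    # Toy one-way identification test over a BSC (illustrative only).
    # P1 and P2 derive the same random generator matrix from a shared seed,
    # P1 sends the (1+gamma)*n-bit codeword of x through the channel, and
    # P2 thresholds the Hamming distance to the codeword of y.
    n = len(x_bits)
    m = int((1 + gamma) * n)
    shared = random.Random(shared_seed)
    G = [[shared.randint(0, 1) for _ in range(n)] for _ in range(m)]
    received = bsc(encode(x_bits, G), eps, random.Random())
    dist = sum(r != c for r, c in zip(received, encode(y_bits, G)))
    # If x == y the distance concentrates near eps*m; if x != y the two
    # codewords of a random linear code differ in about m/2 positions,
    # so the distance concentrates near m/2.
    threshold = (eps + (0.5 - eps) / 2) * m
    return dist <= threshold  # True means "P1 and P2 hold the same integer"

if __name__ == "__main__":
    x = [random.randint(0, 1) for _ in range(64)]
    y = list(x); y[0] ^= 1  # differs from x in a single bit
    print(identify(x, x, eps=0.1, gamma=0.25))  # usually True
    print(identify(x, y, eps=0.1, gamma=0.25))  # usually False
```

The intuition the sketch illustrates: when the two integers are equal, the distance concentrates near ε·m, and when they differ, it concentrates near m/2, so a threshold between the two is crossed only with probability exponentially small in n, in line with the c·2^(-αn) bound stated above.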
