Abstract
We consider the standard two-party communication model. The central problem studied in this article is how much one can save in information complexity by allowing an error of ε.

• For arbitrary functions, we obtain lower bounds and upper bounds indicating a gain that is of order Ω(h(ε)) and O(h(√ε)). Here h denotes the binary entropy function.
• We analyze the case of the two-bit AND function in detail to show that for this function the gain is Θ(h(ε)). This answers a question of Braverman et al. [4].
• We obtain sharp bounds for the set disjointness function of order n. For the case of the distributional error, we introduce a new protocol that achieves a gain of Θ(h(ε)) provided that n is sufficiently large. We apply these results to answer another question of Braverman et al. regarding the randomized communication complexity of the set disjointness function.
• Answering a question of Braverman [3], we apply our analysis of the set disjointness function to establish a gap between two different notions of the prior-free information cost. In light of [3], this implies that the amortized randomized communication complexity is not necessarily equal to the amortized distributional communication complexity with respect to the hardest distribution.

As a consequence, we show that the ε-error randomized communication complexity of the set disjointness function of order n is n[C_DISJ − Θ(h(ε))] + o(n), where C_DISJ ≈ 0.4827 is the constant found by Braverman et al. [4].
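All of the bounds above are expressed through the binary entropy function h. As background (this is the standard definition, not a result of the paper), it can be stated as:

```latex
% Binary entropy function used throughout the abstract:
h(\varepsilon) \;=\; -\varepsilon \log_2 \varepsilon \;-\; (1-\varepsilon)\log_2(1-\varepsilon),
\qquad h(0) = h(1) = 0.
% Since h(\varepsilon) = \Theta(\varepsilon \log(1/\varepsilon)) as \varepsilon \to 0,
% a gain of order h(\varepsilon) is asymptotically larger than \varepsilon itself.
```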
Highlights
Communication complexity studies the amount of communication needed to compute a function whose inputs are spread among several parties.
We show that the ε-error randomized communication complexity of the set disjointness function of order n is n[C_DISJ − Θ(h(ε))] + o(n), where C_DISJ ≈ 0.4827 is the constant found by Braverman et al. (STOC'13).
While communication complexity is concerned with minimizing the amount of communication required for two players to evaluate a function, information complexity is concerned with the amount of information that the communicated bits reveal about the players' inputs.
Summary
Communication complexity studies the amount of communication needed to compute a function whose inputs are spread among several parties. The goal, first studied in [1], is to determine the asymptotic rate of growth of the randomized communication complexity R_ε(DISJ_n) of set disjointness, defined as the smallest number of bits exchanged by the two players in a protocol that computes the function correctly with probability at least 1 − ε on every input. The distributional information complexity IC(f, μ, ε) of a function f with respect to a distribution μ on the inputs is the minimal amount of information that the players need to leak in any protocol that computes f correctly with probability at least 1 − ε when the inputs are drawn according to μ. Braverman et al. showed that IC(f, μ, 0) − IC(f, μ, ε) ≤ C(f, μ)·h(ε^(1/8)), where C(f, μ) denotes a positive constant which depends only on f and μ.
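To get a feel for the magnitude of the savings n·Θ(h(ε)) in the leading term, the following sketch computes the binary entropy h(ε) numerically. The constant C_DISJ ≈ 0.4827 is quoted from the text; the constant c inside Θ(h(ε)) is not specified there, so `c = 1.0` below is purely illustrative.

```python
import math

def binary_entropy(eps: float) -> float:
    """h(eps) = -eps*log2(eps) - (1-eps)*log2(1-eps), with h(0) = h(1) = 0."""
    if eps in (0.0, 1.0):
        return 0.0
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

# Constant from Braverman et al. (STOC'13), as quoted in the text.
C_DISJ = 0.4827

def leading_term(n: int, eps: float, c: float = 1.0) -> float:
    """Leading term n*(C_DISJ - c*h(eps)) of the eps-error randomized
    communication complexity of DISJ_n. The hidden constant c of the
    Theta(h(eps)) term is unknown here; c=1 is an illustrative placeholder."""
    return n * (C_DISJ - c * binary_entropy(eps))

print(binary_entropy(0.5))         # 1.0: a fair coin carries one bit of entropy
print(binary_entropy(0.01))        # ~0.081: much larger than the error 0.01 itself
print(leading_term(10**6, 0.01))   # illustrative leading term for n = 10^6
```

Note how h(ε) ≈ 0.081 at ε = 0.01: because h(ε) = Θ(ε log(1/ε)) near 0, even a small allowed error buys savings far exceeding ε per coordinate.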