Abstract

Given a function $f:\{0,1\}^n \to \{+1,-1\}$, its Fourier Entropy is defined to be $-\sum_S \hat{f}(S)^2 \log \hat{f}(S)^2$, where $\hat{f}$ denotes the Fourier transform of $f$. In the analysis of Boolean functions, an outstanding open question is a conjecture of Friedgut and Kalai, 1996 [3], called the Fourier Entropy Influence (FEI) Conjecture, asserting that the Fourier Entropy of any Boolean function $f$ is bounded above, up to a constant factor, by the total influence (= average sensitivity) of $f$.

In this paper we give several upper bounds on the Fourier Entropy. We first give upper bounds on the Fourier Entropy of Boolean functions in terms of several complexity measures that are known to be bigger than the influence. These complexity measures include, among others, the logarithm of the number of leaves and the average depth of a parity decision tree. We then show that for the class of Linear Threshold Functions (LTFs), the Fourier Entropy is $O(\sqrt{n})$. It is known that the average sensitivity for the class of LTFs is $\Theta(\sqrt{n})$. We also establish a bound of $O_d(n^{1-\frac{1}{4d+6}})$ for general degree-$d$ polynomial threshold functions. Our proof is based on a new upper bound on the derivative of noise sensitivity. Next we proceed to show that the FEI Conjecture holds for read-once formulas that use AND, OR, XOR, and NOT gates. The last result is independent of a result due to O'Donnell and Tan [1] for read-once formulas with arbitrary gates of bounded fan-in, but our proof is completely elementary and very different from theirs. Finally, we give a general bound involving the first and second moments of sensitivities of a function (average sensitivity being the first moment), which holds for real-valued functions as well.
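To make the two quantities compared by the FEI Conjecture concrete, the following brute-force Python sketch (ours, not from the paper; base-2 logarithms assumed) computes the Fourier entropy and the total influence $\sum_S |S|\hat{f}(S)^2$ of the 3-bit majority function.

```python
# Illustrative only: brute-force Fourier analysis of a small Boolean function.
from itertools import product
from math import log2

def fourier_coefficients(f, n):
    """Return {S: f_hat(S)} where f_hat(S) = 2^-n * sum_x f(x) * (-1)^(sum_{i in S} x_i)."""
    points = list(product((0, 1), repeat=n))
    coeffs = {}
    for mask in range(2 ** n):
        S = frozenset(i for i in range(n) if (mask >> i) & 1)
        coeffs[S] = sum(f(x) * (-1) ** sum(x[i] for i in S) for x in points) / 2 ** n
    return coeffs

def fourier_entropy(coeffs):
    """-sum_S f_hat(S)^2 * log2(f_hat(S)^2), skipping zero coefficients."""
    return -sum(c * c * log2(c * c) for c in coeffs.values() if c != 0)

def total_influence(coeffs):
    """Total influence (= average sensitivity) = sum_S |S| * f_hat(S)^2."""
    return sum(len(S) * c * c for S, c in coeffs.items())

if __name__ == "__main__":
    maj3 = lambda x: 1 if sum(x) >= 2 else -1   # 3-bit majority, output in {+1, -1}
    coeffs = fourier_coefficients(maj3, 3)
    print("Fourier entropy:", fourier_entropy(coeffs))   # 2.0
    print("Total influence:", total_influence(coeffs))   # 1.5
```

For majority on 3 bits the four nonzero squared coefficients are each $1/4$, giving entropy $2$ against total influence $3/2$, consistent with the conjectured bound (up to a constant factor).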
