Abstract

A code C ⊆ {0, 1}^n is (s, L) erasure list-decodable if for every word w, after erasing any s symbols of w, the remaining n - s symbols have at most L possible completions into a codeword of C. Non-explicitly, there exist binary ((1 - τ)n, L) erasure list-decodable codes with rate approaching τ and tiny list-size L = O(log(1/τ)). Achieving either of these parameters explicitly is a natural open problem (see, e.g., [26, 24, 25]). While partial progress on the problem has been made, no prior nontrivial explicit construction achieved rate better than Ω(τ^2) or list-size smaller than Ω(1/τ). Furthermore, Guruswami showed that no linear code can have list-size smaller than Ω(1/τ) [24]. We construct an explicit binary ((1 - τ)n, L) erasure list-decodable code having rate τ^(1+γ) (for any constant γ > 0 and small τ) and list-size poly(log(1/τ)), simultaneously answering both questions and exhibiting an explicit non-linear code that provably beats the best possible linear code.

The binary erasure list-decoding problem is equivalent to the construction of explicit, low-error, strong dispersers outputting one bit with minimal entropy-loss and seed-length. For error ε, no prior explicit construction achieved seed-length better than 2 log(1/ε) or entropy-loss smaller than 2 log(1/ε), which are the best possible parameters for extractors. We explicitly construct an ε-error one-bit strong disperser with near-optimal seed-length (1 + γ) log(1/ε) and entropy-loss O(log log(1/ε)).

The main ingredient in our construction is a new (and almost-optimal) unbalanced two-source extractor. The extractor extracts one bit with constant error from two independent sources, where one source has length n and tiny min-entropy O(log log n) and the other source has length O(log n) and arbitrarily small constant min-entropy rate. When instantiated as a balanced two-source extractor, it improves upon Raz's extractor [39] in the constant-error regime. The construction combines recent components and ideas from extractor theory with a delicate and novel analysis needed to resolve dependency and error issues that prevented previous works (such as [27, 9, 13]) from achieving the above results.
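For concreteness, the opening definition can be stated formally as follows. This is a minimal LaTeX restatement of the abstract's first sentence; the erasure set S and the environment wording are ours, not taken from the paper body.

% Restatement of the erasure list-decodability definition from the abstract.
% The symbols C, w, s, L, n follow the abstract; the set S is introduced here.
\begin{definition}[Erasure list-decodability]
A code $\mathcal{C} \subseteq \{0,1\}^{n}$ is \emph{$(s, L)$ erasure list-decodable}
if for every word $w \in \{0,1\}^{n}$ and every set $S \subseteq [n]$ of $s$ erased
coordinates, at most $L$ codewords of $\mathcal{C}$ agree with $w$ on the $n - s$
coordinates outside $S$.
\end{definition}
% Taking $s = (1-\tau)n$ gives the parametrization used above: a
% $((1-\tau)n, L)$ erasure list-decodable code tolerates a $(1-\tau)$ fraction
% of erasures while keeping the list of consistent codewords of size at most $L$.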

