Abstract

We analyze the asymptotics of the normalized remaining uncertainty of a source when a compressed or hashed version of it and correlated side information are observed. For this system, commonly known as Slepian–Wolf source coding, we establish the optimal (minimum) compression rate of the source that ensures the remaining uncertainty vanishes. We also study the exponential rate at which the remaining uncertainty decays to zero when the rate is above this optimal compression rate. In this paper, we consider various classes of random universal hash functions. Instead of measuring the remaining uncertainty with traditional Shannon information measures, we do so using two forms of the conditional Rényi entropy. Among other techniques, we employ new one-shot bounds and Merhav's method of moments of type class enumerators for these evaluations. We show that these asymptotic results generalize the strong converse exponent and the error exponent of the Slepian–Wolf problem under maximum a posteriori decoding.
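For context, the abstract refers to "two forms of the conditional Rényi entropy" without stating which. A minimal sketch of the two forms most commonly used in this line of work is given below, namely the Hayashi-type and Arimoto-type definitions; it is an assumption here that these are the two forms the paper adopts. For a joint distribution \(P_{XY}\) and order \(\alpha \in (0,1) \cup (1,\infty)\):

```latex
% Hayashi-type ("down-arrow") conditional Renyi entropy
H_\alpha^{\downarrow}(X \mid Y)
  = \frac{1}{1-\alpha}
    \log \sum_{y} P_Y(y) \sum_{x} P_{X \mid Y}(x \mid y)^{\alpha}

% Arimoto-type ("up-arrow") conditional Renyi entropy
H_\alpha^{\uparrow}(X \mid Y)
  = \frac{\alpha}{1-\alpha}
    \log \sum_{y} P_Y(y)
         \Bigl( \sum_{x} P_{X \mid Y}(x \mid y)^{\alpha} \Bigr)^{1/\alpha}
```

Both quantities reduce to the conditional Shannon entropy \(H(X \mid Y)\) in the limit \(\alpha \to 1\), which is why results stated in terms of them generalize the classical Slepian–Wolf exponents.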
