Abstract

We consider sources and channels with memory observed through erasure channels. In particular, we examine the impact of sporadic erasures on the fundamental limits of lossless data compression, lossy data compression, channel coding, and denoising. We define the erasure entropy of a collection of random variables as the sum of the entropies of the individual variables, each conditioned on all the rest. The erasure entropy measures the information content carried by each symbol given its context. The erasure entropy rate is shown to be the minimal number of bits per erasure required to recover the lost information in the limit of small erasure probability. When the erased symbols may be recovered within a prescribed degree of distortion, the fundamental tradeoff is described by the erasure rate-distortion function, which we characterize. We show that in the regime of sporadic erasures, knowledge at the encoder of the erasure locations does not lower the rate required to achieve a given distortion. When no additional encoded information is available, the erased information is reconstructed solely on the basis of its context by a denoiser. Connections between erasure entropy and discrete denoising are developed. The decrease in the capacity of channels with memory due to sporadic memoryless erasures is also characterized in wide generality.
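
As a minimal sketch of the definition above (the notation $H^-(\cdot)$ for erasure entropy is an assumption made here for illustration, not fixed by the abstract itself): for random variables $X_1,\dots,X_n$,

\[ H^-(X_1,\dots,X_n) \;=\; \sum_{i=1}^{n} H\bigl(X_i \mid X_1,\dots,X_{i-1},X_{i+1},\dots,X_n\bigr), \qquad H^-(\mathbf{X}) \;=\; \lim_{n\to\infty} \frac{1}{n}\, H^-(X_1,\dots,X_n). \]

For a stationary first-order Markov chain, for instance, conditioning on "all the rest" reduces to conditioning on the two neighboring symbols, so the erasure entropy rate is the single conditional entropy $H(X_2 \mid X_1, X_3)$.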
