Abstract

Shannon's capacity and rate-distortion function, combined with the separation principle, provide tight bounds on the minimum possible distortion in joint source-channel coding. These bounds, however, are usually achievable only in the limit of large block length. In their 1973 paper, Ziv and Zakai introduced a family of alternative capacity and rate-distortion functions, based on functionals satisfying the data-processing inequality, which can give tighter bounds for systems with a small block length. There is considerable freedom in choosing these functionals, and no general procedure is specified for finding the functionals that yield the best bounds for a given source-channel combination. We examine recently conjectured high-SNR asymptotic expressions for the Ziv-Zakai bounds based on the Rényi-divergence functional. We derive nonasymptotic bounds on the Ziv-Zakai-Rényi rate-distortion function and capacity for a broad class of sources and additive noise channels, which hold for arbitrary SNR, and we prove the conjectured asymptotic expressions in the limit of small distortion/high SNR. The results lead to new bounds on the best achievable distortion in finite-dimensional joint source-channel coding. Examples are presented where the new bounds achieve significant improvement upon Shannon's original bounds.
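For reference, the Rényi divergence of order \(\alpha\) underlying the functionals mentioned above is the standard quantity

\[
D_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}\,
\log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx,
\qquad \alpha \in (0,1) \cup (1,\infty),
\]

which recovers the Kullback-Leibler divergence as \(\alpha \to 1\). Like the KL divergence, it satisfies the data-processing inequality: if \(P_Y\) and \(Q_Y\) are obtained by passing \(P_X\) and \(Q_X\) through a common channel, then \(D_\alpha(P_Y \,\|\, Q_Y) \le D_\alpha(P_X \,\|\, Q_X)\). This is the property that makes it an admissible functional in the Ziv-Zakai framework.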
