Abstract
Whenever digital communication is subject to constraints on delay and complexity, some residual redundancy remains after source coding. Under such constraints, Shannon's separation theorem does not hold strictly, and joint source-channel decoding becomes beneficial. In this paper we discuss two competing approaches to exploiting residual redundancy in the form of (auto)correlation. Both are analyzed with respect to their impact on quantization noise and channel distortion. The first approach exploits redundancy at the receiver for minimum mean squared error (MMSE) estimation of source codec parameters. The second approach exploits correlation at the transmitter by predictive encoding. The prediction gain allows the quantization noise or the bit rate to be reduced; the spare bit rate can be used for dedicated error protection and hence for a reduction of channel distortion. However, less redundancy then remains utilizable at the receiver for parameter estimation. Simulations show that the predictive encoding scheme improves quality over a wide range of channel conditions, but, in contrast to what Shannon's separation theorem suggests, the utilization of correlation at the receiver side is beneficial under very bad channel conditions.
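To make the trade-off concrete, the following is a minimal sketch under common assumptions (a first-order Markov model for the parameter sequence; the symbols $v_k$, $\hat{x}$, $e_k$, and $G_p$ are illustrative and not taken from the paper). The receiver-side approach forms an MMSE estimate of each codec parameter $v_k$ from the a posteriori probabilities of its quantizer reproduction levels $v^{(i)}$,
\[
\hat{v}_k = \sum_i v^{(i)} \, P\big(v_k = v^{(i)} \mid \hat{x}_1, \dots, \hat{x}_k\big),
\]
where the a posteriori probabilities combine the channel transition probabilities with a priori transition probabilities $P(v_k \mid v_{k-1})$ that capture the residual (auto)correlation. The transmitter-side approach instead quantizes the prediction error $e_k = v_k - \tilde{v}_k$, with $\tilde{v}_k$ the prediction from past parameters; its prediction gain
\[
G_p = \frac{\sigma_v^2}{\sigma_e^2}
\]
translates, at a fixed bit rate, into quantization noise lowered by roughly the factor $G_p$, or equivalently into roughly $\tfrac{1}{2}\log_2 G_p$ bits per parameter that can be reassigned to error protection.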