Abstract
This book is about information theory and rate distortion theory, aimed specifically at the problems of communications and compression. Both were introduced by Shannon in his landmark 1948 paper [1, 2], in which he presented the fundamental results on lossless (noiseless) source coding, channel capacity, and rate distortion theory, and demonstrated the central roles that entropy and mutual information play in establishing these fundamental limits. Shannon revisited the limits of lossy source coding in his 1959 paper [3], in which he coined the term rate distortion function, proved coding theorems, calculated R(D) for several examples, derived what we now call the Shannon lower bound on R(D), and noted the duality between a rate distortion function and a capacity cost function.
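For orientation, the rate distortion function and the Shannon lower bound referred to above are conventionally stated as follows; this is the standard textbook formulation, not notation taken from this book.

\[
R(D) \;=\; \min_{p(\hat{x}\mid x)\,:\;\mathbb{E}[d(X,\hat{X})]\le D} I(X;\hat{X}),
\qquad
R(D) \;\ge\; h(X) \;-\; \max_{Z\,:\;\mathbb{E}[d(Z)]\le D} h(Z),
\]

where the right-hand inequality is the Shannon lower bound for a difference distortion measure \(d(x,\hat{x}) = d(x-\hat{x})\), with \(h(\cdot)\) denoting differential entropy.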