Abstract
We explore the connections between rate distortion (lossy source coding) and two deep learning models: Restricted Boltzmann Machines (RBMs) and Deep Belief Networks (DBNs). We show that the rate-distortion function can be expressed in terms of the RBM log partition function, and that RBMs/DBNs can be used to learn the rate-distortion-achieving posterior, analogous to the Blahut-Arimoto algorithm. We propose an algorithm for lossy compression of binary sources. The algorithm consists of two stages: a training stage that learns the posterior from training data of the same class as the source, and a compression/reproduction stage comprising a lossless compression step and a lossless reproduction step. Theoretical results show that the proposed algorithm asymptotically achieves the optimal rate-distortion function for stationary ergodic sources. Numerical experiments show that the proposed algorithm outperforms the best previously reported results.
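For context, the Blahut-Arimoto algorithm referenced above alternates between updating the posterior (the test channel) and the reproduction marginal until they converge to the rate-distortion-achieving distributions. The sketch below is not the paper's RBM-based method; it is a minimal, standard Blahut-Arimoto iteration for a finite-alphabet source, with the binary symmetric source under Hamming distortion as an example. All function and variable names are illustrative.

```python
import numpy as np

def blahut_arimoto(p_x, dist, beta, n_iter=500):
    """Standard Blahut-Arimoto iteration for the rate-distortion function.

    p_x:  source distribution over the source alphabet
    dist: distortion matrix, dist[x, xhat] = d(x, xhat)
    beta: Lagrange multiplier trading rate against distortion (nats scale)
    Returns (R, D): rate in bits and expected distortion at this slope.
    """
    n_x, n_xhat = dist.shape
    q = np.full(n_xhat, 1.0 / n_xhat)      # reproduction marginal q(xhat)
    expd = np.exp(-beta * dist)            # exp(-beta * d(x, xhat))
    for _ in range(n_iter):
        # Posterior update: Q(xhat | x) proportional to q(xhat) exp(-beta d)
        Q = q * expd
        Q /= Q.sum(axis=1, keepdims=True)
        # Marginal update: q(xhat) = sum_x p(x) Q(xhat | x)
        q = p_x @ Q
    D = float(np.sum(p_x[:, None] * Q * dist))          # expected distortion
    ratio = np.where(Q > 0, Q / q, 1.0)                 # avoid log(0) terms
    R = float(np.sum(p_x[:, None] * Q * np.log2(ratio)))  # rate in bits
    return R, D

# Binary symmetric source with Hamming distortion, where R(D) = 1 - H2(D)
p = np.array([0.5, 0.5])
d = np.array([[0.0, 1.0],
              [1.0, 0.0]])
R, D = blahut_arimoto(p, d, beta=2.0)
```

For this binary symmetric case the returned pair should satisfy the known closed form R = 1 - H2(D), which gives a quick sanity check on the iteration.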