Abstract

In this paper, we introduce a new algorithm called ‘bits-back coding’ that makes stochastic source codes efficient. For a given one-to-many source code, we show that this algorithm can actually be more efficient than the algorithm that always picks the shortest codeword. Optimal efficiency is achieved when codewords are chosen according to the Boltzmann distribution based on the codeword lengths. It turns out that a commonly used technique for determining parameters, maximum-likelihood estimation, actually minimizes the bits-back coding cost when codewords are chosen according to the Boltzmann distribution. A tractable approximation to maximum-likelihood estimation, the generalized expectation-maximization algorithm, also minimizes the bits-back coding cost. After presenting a binary Bayesian network model that assigns exponentially many codewords to each symbol, we show how a tractable approximation to the Boltzmann distribution can be used for bits-back coding. We illustrate the performance of bits-back coding on non-synthetic data with a binary Bayesian network source model that produces 2^60 possible codewords for each input symbol. The rate for bits-back coding is nearly one half of that obtained by picking the shortest codeword for each symbol.
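As a minimal sketch of the free-energy argument behind these claims (the notation l(c) for the length in bits of codeword c and Q for the distribution used to choose a codeword is ours, not defined in the abstract), the expected bits-back coding cost for one symbol is

\[
F(Q) \;=\; \sum_{c} Q(c)\,l(c) \;-\; H(Q),
\qquad
H(Q) \;=\; -\sum_{c} Q(c)\log_{2} Q(c),
\]

since the l(c) bits spent sending the chosen codeword are partly refunded by the H(Q) auxiliary bits recovered by the decoder. F(Q) is minimized by the Boltzmann distribution over codeword lengths,

\[
Q^{*}(c) \;=\; \frac{2^{-l(c)}}{\sum_{c'} 2^{-l(c')}},
\qquad
F(Q^{*}) \;=\; -\log_{2}\sum_{c} 2^{-l(c)} \;\le\; \min_{c} l(c),
\]

so choosing codewords stochastically according to Q* can cost strictly fewer bits than always picking the shortest codeword.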
