Abstract

The Restricted Boltzmann Machine (RBM), a special case of the general Boltzmann Machine and a typical Probabilistic Graphical Model, has attracted much attention in recent years due to its power in extracting features and representing the distribution underlying the training data. The most commonly used algorithm for learning RBMs is Contrastive Divergence (CD), proposed by Hinton, which starts a Markov chain at a data point and runs the chain for only a few steps to obtain a low-variance estimator. However, for a high-order RBM, the interactions among its visible layers typically drive the CD gradient approximation far from the log-likelihood gradient, and may even trap CD learning in an infinite loop with high reconstruction error. In this paper, a new algorithm named Cyclic Contrastive Divergence (CCD) is introduced for learning high-order RBMs. Unlike standard CD, CCD updates the parameters with respect to each visible layer in turn, borrowing the idea of the Cyclic Block Coordinate Descent method. To evaluate the proposed CCD algorithm for high-order RBM learning, both CCD and standard CD are analyzed theoretically, including their convergence, an upper bound on the estimate, and a comparison of their biases, from which the superiority of CCD learning is revealed. Experiments on the MNIST dataset for the handwritten digit classification task show that CCD is more broadly applicable and consistently outperforms standard CD in both convergence speed and final performance.
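
To make the two update schemes concrete, here is a minimal NumPy sketch. The `cd_k_grad` function is the standard CD-k estimator for an ordinary binary RBM; `ccd_step` illustrates the cyclic idea for a high-order model, here assumed to be a three-way RBM whose visible layers x and y are coupled to hidden units h through a tensor W. The function names, the three-way parameterization, and the scheme of clamping the inactive visible layer during each sub-chain are illustrative assumptions for this sketch, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    # Elementwise Bernoulli sample with success probability p
    return (rng.random(p.shape) < p).astype(p.dtype)

def cd_k_grad(v0, W, b, c, k=1):
    """CD-k gradient estimate for an ordinary binary RBM (k >= 1).
    v0: batch of visible vectors, shape (n, n_vis)."""
    ph0 = sigmoid(v0 @ W + c)              # positive phase from the data
    v, h = v0, sample(ph0)
    for _ in range(k):                     # short Gibbs chain started at the data
        v = sample(sigmoid(h @ W.T + b))
        h_prob = sigmoid(v @ W + c)
        h = sample(h_prob)
    n = v0.shape[0]
    dW = (v0.T @ ph0 - v.T @ h_prob) / n   # positive minus negative statistics
    return dW, (v0 - v).mean(axis=0), (ph0 - h_prob).mean(axis=0)

def ccd_step(x0, y0, W, a, b, c, lr=0.01, k=1):
    """One cyclic pass for an assumed three-way RBM with visible layers x, y
    and hidden h coupled by a tensor W of shape (n_x, n_y, n_h): the
    parameters are updated with respect to each visible layer in turn,
    keeping the other visible layer clamped to the data during its sub-chain."""
    for clamped in ("y", "x"):
        ph0 = sigmoid(np.einsum("ijk,i,j->k", W, x0, y0) + c)
        h, x, y = sample(ph0), x0.copy(), y0.copy()
        for _ in range(k):
            if clamped == "y":   # resample only x; y stays at the data
                x = sample(sigmoid(np.einsum("ijk,j,k->i", W, y, h) + a))
            else:                # resample only y; x stays at the data
                y = sample(sigmoid(np.einsum("ijk,i,k->j", W, x, h) + b))
            h_prob = sigmoid(np.einsum("ijk,i,j->k", W, x, y) + c)
            h = sample(h_prob)
        # CD-style update from this sub-chain before moving to the next layer
        W += lr * (np.einsum("i,j,k->ijk", x0, y0, ph0)
                   - np.einsum("i,j,k->ijk", x, y, h_prob))
        a += lr * (x0 - x)
        b += lr * (y0 - y)
        c += lr * (ph0 - h_prob)
```

The cyclic structure mirrors Cyclic Block Coordinate Descent: each sub-update treats one visible layer as the active block and holds the remaining layer fixed, so the negative-phase chain never has to mix over the interacting visible layers simultaneously.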
