Abstract

Restricted Boltzmann machines (RBMs) are the building blocks of deep belief networks, widely regarded as among the first effective deep learning architectures. This paper studies the ability of RBMs to represent distributions over {0,1}^n via softplus/hardplus RBM networks. It is shown that any distribution whose density depends only on the number of 1's in its input can be approximated to arbitrarily high accuracy by an RBM of size 2n+1, improving a previous result by reducing the required size from n^2 to 2n+1. A theorem for representing partially symmetric Boolean functions by softplus RBM networks is established. Building on this, the representational power of RBMs for distributions whose mass represents a Boolean function is compared with that of threshold circuits and polynomial threshold functions. It is shown that a distribution over {0,1}^n whose mass represents a Boolean function can be computed with a given margin δ by an RBM of size and parameters bounded by polynomials in n if and only if it can be computed by a depth-2 threshold circuit with size and parameters bounded by polynomials in n.
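To make the softplus view concrete, here is a minimal NumPy sketch assuming the standard binary RBM parameterization (not the paper's specific construction): marginalizing out the hidden units of a binary RBM gives log p(v) = b·v + Σ_j softplus(c_j + w_j·v) − log Z, a softplus network in v. Choosing identical weight columns then makes log p(v) depend on v only through its number of 1's, the symmetric case the size-(2n+1) bound addresses.

```python
import numpy as np

def rbm_log_p_unnorm(v, W, b, c):
    """Unnormalized log-probability of visible vector v under a binary RBM.

    Summing out the hidden units h in {0,1}^m yields
        log p(v) = b.v + sum_j softplus(c_j + W[j].v) - log Z,
    i.e. a softplus network in v (the normalizer Z is omitted here).
    """
    pre = c + W @ v                                # hidden pre-activations
    return b @ v + np.sum(np.logaddexp(0.0, pre))  # softplus(x) = log(1 + e^x)

# Toy check: if every hidden unit places the same weight on each visible
# unit, log p(v) depends on v only through the number of 1's in v.
rng = np.random.default_rng(0)
n, m = 6, 2 * 6 + 1                  # 2n+1 hidden units, matching the bound
w = rng.normal(size=m)               # one shared weight per hidden unit
W = np.outer(w, np.ones(n))          # identical columns -> symmetric in v
b = np.full(n, 0.3)                  # uniform visible bias
c = rng.normal(size=m)

v1 = np.array([1, 1, 0, 0, 0, 0.])   # two 1's
v2 = np.array([0, 0, 0, 1, 0, 1.])   # also two 1's
assert np.isclose(rbm_log_p_unnorm(v1, W, b, c),
                  rbm_log_p_unnorm(v2, W, b, c))
```

The paper's construction chooses the 2n+1 hidden units' weights and biases to hit an arbitrary target symmetric density, rather than the shared-weight choice used here purely for illustration.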
