Abstract

Despite the successful use of Gaussian-binary restricted Boltzmann machines (GB-RBMs) and Gaussian-binary deep belief networks (GB-DBNs), little is known about their theoretical capability to approximate distributions of continuous random variables. In this paper, we address the expressive properties of GB-RBMs and GB-DBNs, contributing theoretical insights into the optimal number of hidden variables. We first express the GB-RBM's unnormalized log-likelihood as the sum of a special two-layer feedforward neural network and a negative quadratic term. We then establish a series of simulation results that relate GB-RBMs to general two-layer feedforward neural networks, whose expressive properties are much better understood. On this basis, we show that a two-layer ReLU network whose second-layer weights are all 1, together with a negative quadratic term, can approximate any continuous function. In addition, we provide lower bounds on the number of hidden variables a GB-RBM requires to approximate distributions whose log-likelihoods are given by certain classes of smooth functions. Moreover, we study the universal approximation property of GB-DBNs with two hidden layers, showing that O(ε^{-2}) hidden variables suffice to approximate any given strictly positive continuous distribution within a given error ε. Finally, numerical experiments are carried out to verify some of the proposed theoretical results.
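To make the stated decomposition concrete, the sketch below writes out the standard GB-RBM free-energy identity that the abstract refers to; the notation (visible variables v in R^n, binary hidden variables h in {0,1}^m, weights w_{ij}, biases b_i and c_j, variances σ_i^2) is our own illustrative choice and may differ from the paper's parameterization.

% Energy of a GB-RBM with visible v \in \mathbb{R}^n and hidden h \in \{0,1\}^m
% (notation chosen for illustration only).
E(v, h) = \sum_{i=1}^{n} \frac{(v_i - b_i)^2}{2\sigma_i^2}
        - \sum_{j=1}^{m} c_j h_j
        - \sum_{i=1}^{n}\sum_{j=1}^{m} \frac{v_i}{\sigma_i^2}\, w_{ij} h_j .

% Summing out the binary hidden variables factorizes over j and yields the
% unnormalized log-likelihood:
\log p^{*}(v) = \log \sum_{h \in \{0,1\}^m} e^{-E(v,h)}
             = -\sum_{i=1}^{n} \frac{(v_i - b_i)^2}{2\sigma_i^2}
               + \sum_{j=1}^{m} \operatorname{softplus}\!\Big(c_j + \sum_{i=1}^{n} \frac{w_{ij} v_i}{\sigma_i^2}\Big),

% where softplus(x) = \log(1 + e^{x}). The second sum is a two-layer
% feedforward network whose output weights are all 1, and the first term is
% the negative quadratic term mentioned in the abstract.

One natural route from here, which the abstract's ReLU statement suggests, is that suitably scaled softplus units uniformly approximate ReLU units, so approximation results for two-layer ReLU networks with unit output weights can be transferred back to the GB-RBM's log-likelihood.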
