Abstract

A Restricted Boltzmann Machine (RBM) is proposed with an energy function that, as we show, yields hidden-node activation probabilities matching the activation rule of neurons in a Gaussian synapse neural network. This makes the proposed RBM a potential tool for pre-training a Gaussian synapse network with a deep architecture, analogous to the greedy layer-wise pre-training procedure in which RBMs have been used for deep neural networks with scalar synapses. Using experimental examples, we investigate the training characteristics of this form of RBM and discuss its suitability for pre-training a deep Gaussian synapse network. While this is the most direct route to such a network, we identify and discuss a number of issues that arise when the proposed form of RBM is used in this way, and suggest possible solutions.
