Abstract

Restricted Boltzmann machines (RBMs) have been actively studied in the field of deep neural networks. RBMs are stochastic artificial neural networks that can learn the probability distribution of an input dataset. However, they require considerable computational resources, long processing times, and high power consumption due to the huge number of random number generations needed to obtain stochastic behavior. Therefore, dedicated hardware implementations of RBMs are desired for consumer applications on low-power devices. To realize a hardware implementation of RBMs in a massively parallel manner, each unit must include a random number generator (RNG), which occupies substantial hardware resources. In this paper, we propose a hardware-oriented RBM algorithm that does not require RNGs. In the proposed method, we employ as random numbers the underflow bits obtained from the calculation process of the firing probability. We have developed a software implementation of fixed-point RBMs to evaluate the proposed method. Experimental results show that a 16-bit fixed-point RBM can be trained by the proposed method, and that the underflow bits can be used as random numbers in RBM training.
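The core idea can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: it assumes a 16-bit fixed-point format, computes the firing probability in extended precision, and reuses the truncated low-order (underflow) bits as the random number for the stochastic firing decision, in place of a dedicated RNG. The function names and the specific firing rule here are illustrative assumptions.

```python
import math

FRAC_BITS = 16          # assumed fixed-point fractional width (16-bit RBM)
SCALE = 1 << FRAC_BITS  # fixed-point scaling factor

def firing_probability_fixed(act_fixed):
    """Compute the sigmoid firing probability of a fixed-point activation
    in extended precision, then quantize to FRAC_BITS.

    Returns (quantized probability, underflow bits), where the underflow
    bits are the low-order bits discarded by the truncation -- the paper's
    proposed source of randomness (illustrative reconstruction).
    """
    p = 1.0 / (1.0 + math.exp(-act_fixed / SCALE))  # real-valued sigmoid
    full = int(p * SCALE * SCALE)     # extended-precision fixed-point result
    prob_fixed = full >> FRAC_BITS    # quantized probability (kept bits)
    underflow = full & (SCALE - 1)    # discarded bits, reused as "random" value
    return prob_fixed, underflow

def fire(act_fixed):
    """Stochastic unit output: fire iff the reused underflow bits fall
    below the quantized probability (an assumed comparison rule)."""
    prob, rnd = firing_probability_fixed(act_fixed)
    return 1 if rnd < prob else 0
```

In a conventional implementation, `rnd` would come from a per-unit RNG; here it costs nothing extra because the underflow bits are produced anyway during the probability calculation, which is what makes the approach attractive for massively parallel hardware.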
