Abstract
Restricted Boltzmann machines (RBMs) have been actively studied in the field of deep neural networks. RBMs are stochastic artificial neural networks that can learn a probability distribution over input datasets. However, they require considerable computational resources, long processing times, and high power consumption because of the huge number of random numbers that must be generated to obtain stochastic behavior. Dedicated hardware implementations of RBMs are therefore desirable for consumer applications on low-power devices. To implement RBMs in hardware in a massively parallel manner, each unit must include a random number generator (RNG), and these RNGs occupy substantial hardware resources. In this paper, we propose a hardware-oriented RBM algorithm that does not require RNGs. In the proposed method, we use the underflow bits obtained in the process of computing the firing probability as random numbers. We developed a software implementation of fixed-point RBMs to evaluate the proposed method. Experimental results show that a 16-bit fixed-point RBM can be trained with the proposed method, and that the underflow bits can serve as random numbers in RBM training.
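The core idea above can be illustrated with a minimal sketch. The paper's exact fixed-point format, sigmoid implementation, and bit widths are not given here, so the Q8.8 format, the piecewise-linear sigmoid stand-in, and all function names below are assumptions made purely for illustration: the low-order bits that a fixed-point multiply would normally discard are recycled as the random threshold for a unit's stochastic firing decision, in place of an explicit RNG.

```python
# Hedged sketch (assumptions: Q8.8 fixed point, piecewise-linear sigmoid).
# Illustrates reusing the underflow bits of a fixed-point multiply as the
# pseudo-random threshold for stochastic firing, instead of calling an RNG.

FRAC_BITS = 8  # Q8.8: 8 integer bits, 8 fractional bits


def to_fixed(x: float) -> int:
    """Convert a float to Q8.8 fixed point."""
    return int(round(x * (1 << FRAC_BITS)))


def sigmoid_fixed(a_fx: int) -> int:
    """Crude fixed-point logistic stand-in (piecewise linear over [-4, 4]).

    Real hardware would likely use a lookup table; this keeps the sketch
    self-contained. Returns a Q8.8 probability in [0, 256].
    """
    lo, hi = to_fixed(-4.0), to_fixed(4.0)
    a = max(lo, min(hi, a_fx))
    return (a - lo) * (1 << FRAC_BITS) // (hi - lo)


def fire(state_fx: int, weight_fx: int) -> int:
    """Decide a unit's binary state without an explicit RNG.

    The full product of two Q8.8 values has 16 fractional bits; the low
    8 bits are normally discarded ("underflow bits"). Here they are kept
    and compared against the firing probability, so the discarded
    arithmetic residue plays the role of the random number.
    """
    full_product = state_fx * weight_fx                # Q16.16 product
    underflow = full_product & ((1 << FRAC_BITS) - 1)  # discarded low bits
    pre_act_fx = full_product >> FRAC_BITS             # rounded back to Q8.8
    p_fx = sigmoid_fixed(pre_act_fx)                   # firing prob, Q8.8
    return 1 if underflow < p_fx else 0
```

At saturated pre-activations the behavior is deterministic (probability 0 or 1), while in between the underflow bits vary with the operands and act as a data-dependent threshold; whether such bits are sufficiently random for training is exactly what the paper's experiments evaluate.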