Abstract

Artificial intelligence has surged forward with the advent of generative models, which rely heavily on stochastic computing architectures built on true random number generators with tunable sampling probabilities. In this study, we develop spin-orbit torque magnetic tunnel junctions (SOT-MTJs) and investigate their sigmoid-shaped switching probability as a function of the driving voltage. This feature is ideally suited to stochastic computing algorithms such as the restricted Boltzmann machines (RBMs) widely used in pretraining. We exploit SOT-MTJs as both stochastic samplers and network nodes for RBMs, implementing RBM-based neural networks that recognize both handwritten and spoken digits. We then reuse the weights obtained from the image and speech training to enable cross-modal learning from speech to image generation. Our results demonstrate that SOT-MTJs are promising candidates for hardware accelerators tailored to Boltzmann neural networks and other stochastic computing architectures.
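
To make the connection between a voltage-controlled sigmoid switching probability and RBM sampling concrete, the following minimal Python sketch emulates an SOT-MTJ as the Bernoulli sampler inside one RBM Gibbs step. This is not the paper's implementation: the calibration constants `k` and `v0`, the network sizes, and all weights are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def mtj_sample(preactivation, k=1.0, v0=0.0):
    """Emulate an SOT-MTJ stochastic sampler: the junction switches with a
    sigmoid probability of the applied driving voltage. The slope k and
    offset v0 are hypothetical calibration constants, not device values."""
    p_switch = 1.0 / (1.0 + np.exp(-k * (preactivation - v0)))
    return (rng.random(p_switch.shape) < p_switch).astype(np.float64)

# Tiny RBM with hypothetical layer sizes; each unit's Bernoulli draw
# stands in for reading one MTJ after a voltage pulse.
n_visible, n_hidden = 6, 4
W = rng.normal(0.0, 0.1, size=(n_visible, n_hidden))
b_v = np.zeros(n_visible)
b_h = np.zeros(n_hidden)

def gibbs_step(v):
    """One Gibbs sweep: sample hidden units given visible, then visible
    units given hidden, using the MTJ-style sampler for both layers."""
    h = mtj_sample(v @ W + b_h)
    v_new = mtj_sample(h @ W.T + b_v)
    return v_new, h

v = rng.integers(0, 2, size=n_visible).astype(np.float64)
for _ in range(10):
    v, h = gibbs_step(v)
print("visible sample:", v)
```

In hardware, the pre-activation (weighted sum plus bias) would set the pulse voltage applied to each junction, and the junction's switched/unswitched state after the pulse supplies the binary sample directly, removing the need for a separate pseudorandom number generator.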
