Abstract

Artificial intelligence based on artificial neural networks, originally inspired by the architecture of the biological brain, has mostly been realized in software running on conventional von Neumann computers, where the separation of computing and storage units creates the so-called von Neumann bottleneck that fundamentally limits execution efficiency. A hardware platform that can exploit the full advantages of brain-inspired computing is therefore highly desirable. Based on micromagnetic simulations of the magnetization dynamics, we demonstrate theoretically and numerically that recurrent neural networks consisting of as few as 40 magnetic tunnel junctions can generate and recognize periodic time series after being trained with an efficient algorithm.

Highlights

  • In the past decade, significant progress has been made in artificial intelligence, where advanced algorithms using artificial neural networks (ANNs) have been successfully applied in image recognition, data classification, and other areas.1,2 As an impressive example, the deep learning technique has shown an overwhelming advantage over human players in the game of Go.3–5 ANNs, which emulate the biological architecture of the human brain, inherit its intrinsic advantages, including parallel computation, distributed storage, and low energy consumption

  • An example is random number generation, where the most energy-efficient CMOS implementation consumes 2.9 pJ/bit and occupies a circuit area of 4,004 μm².12 A device based on magnetic tunnel junctions (MTJs) costs only 20 fJ/bit and 2 μm² in area

  • The magnetization dynamics are well described by the phenomenological Landau–Lifshitz–Gilbert (LLG) equation,22,23 which has been validated over the past half century by the magnetism and spintronics research communities
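The LLG dynamics mentioned in the highlight above can be sketched numerically. The following is a minimal macrospin integrator, not the paper's micromagnetic code: it evolves a single unit magnetization under a static field using a Heun step, with the damping constant, field strength, and time step chosen purely for illustration.

```python
import numpy as np

GAMMA = 1.76e11  # gyromagnetic ratio (rad s^-1 T^-1)
ALPHA = 0.1      # Gilbert damping constant (illustrative value)

def llg_rhs(m, h):
    """Explicit LLG form: dm/dt = -gamma/(1+alpha^2) [m x H + alpha m x (m x H)]."""
    mxh = np.cross(m, h)
    return -GAMMA / (1.0 + ALPHA**2) * (mxh + ALPHA * np.cross(m, mxh))

def evolve(m, h, dt, steps):
    """Integrate the LLG equation with a Heun (predictor-corrector) scheme,
    renormalizing |m| = 1 after every step."""
    for _ in range(steps):
        k1 = llg_rhs(m, h)
        k2 = llg_rhs(m + dt * k1, h)
        m = m + 0.5 * dt * (k1 + k2)
        m /= np.linalg.norm(m)
    return m

# Start nearly in-plane; apply a 0.1 T field along z.
m0 = np.array([1.0, 0.0, 0.1])
m0 /= np.linalg.norm(m0)
h = np.array([0.0, 0.0, 0.1])

# 5 ns of dynamics: the magnetization precesses about z while damping
# pulls it toward the field direction.
m_final = evolve(m0, h, dt=1e-13, steps=50_000)
```

With these parameters the damping time is roughly 1/(αγH) ≈ 0.6 ns, so after 5 ns the magnetization has essentially aligned with the field (m_z close to 1), illustrating the relaxation behavior the equation encodes.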


Introduction

Significant progress has been made in artificial intelligence, where advanced algorithms using artificial neural networks (ANNs) have been successfully applied in image recognition, data classification, and other areas.1,2 As an impressive example, the deep learning technique has shown an overwhelming advantage over human players in the game of Go.3–5 ANNs, which emulate the biological architecture of the human brain, inherit its intrinsic advantages, including parallel computation, distributed storage, and low energy consumption. Among ANN architectures, reservoir computing is well suited to encoding time series,28,29 where the reservoir is physically a recurrent neural network (RNN).30 The sparse and usually random connections among the neurons of the RNN provide the capacity to represent sufficiently complex functions,31 and its relatively simple structure is a further advantage for hardware implementation.32,33 In this paper, we report a spintronic realization of RNNs built from MTJs, the basic storage units of spin-transfer torque magnetic random access memory.
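The reservoir-computing scheme described above can be sketched in software. The following is a toy echo state network, not the paper's device model: the reservoir here is a sparsely, randomly connected network of 40 tanh units (the paper's reservoir is physical, the magnetization dynamics of 40 MTJs), trained by ridge regression on a linear readout to continue a periodic time series. All sizes and scaling factors are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 40  # reservoir size, echoing the ~40 junctions in the abstract

# Sparse random recurrent coupling; only the readout weights are trained.
W = rng.normal(size=(N, N)) * (rng.random((N, N)) < 0.2)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1
w_in = rng.normal(size=N)

def step(x, u):
    """One reservoir update driven by scalar input u."""
    return np.tanh(W @ x + w_in * u)

# Teacher signal: a periodic time series (period-25 sine wave).
ts = np.sin(2 * np.pi * np.arange(600) / 25)

# Drive the reservoir and collect its states.
x = np.zeros(N)
states = []
for u in ts[:-1]:
    x = step(x, u)
    states.append(x)
X = np.array(states[100:])  # discard the initial transient
Y = ts[101:]                # next-step targets

# Train the linear readout by ridge regression.
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ Y)

# Generate autonomously: feed the readout output back as the input.
u = ts[-1]
gen = []
for _ in range(50):
    x = step(x, u)
    u = x @ w_out
    gen.append(u)
```

The key design point, shared with the paper's approach, is that the recurrent part is fixed (random here, physical dynamics there) and only the readout is trained, which keeps the training algorithm simple and efficient.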

