Abstract

In‐sensor computing architectures offer significant advantages over systems with separate sensing and processing, particularly for massive data sampling, transfer, and processing. However, most in‐sensor computing devices are designed around the traditional neural network model, in which a synapse performs a linear multiplication of input and weight. This approach fails to fully exploit the nonlinearity of in‐sensor computing devices. Therefore, this article first presents a modified feedforward neural network model with nonlinear in‐sensor computing synapses (NSCS) located at the input layer, and the backpropagation (BP) algorithm is modified to train the network. The nonlinear characteristics of an NSCS composed of spin‐transfer‐torque magnetic tunnel junction (STT–MTJ) devices and a simple complementary metal‐oxide‐semiconductor (CMOS) circuit are then analyzed. Based on the nonlinear response of the STT–MTJ NSCS, a small‐scale network with NSCS synapses is evaluated on the Modified National Institute of Standards and Technology (MNIST) dataset and compared with a traditional network of the same size. Simulation results show that better performance is achieved with the STT–MTJ NSCS, including a 2–15 times improvement in convergence speed and a 2.5%–5.1% increase in accuracy.
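To illustrate the idea of an input layer whose synapses are nonlinear in both input and weight, the following Python/NumPy sketch is provided. It is not the authors' implementation: the synapse response `nscs(w, x) = tanh(w * x)` is a hypothetical stand-in for the measured STT–MTJ NSCS response, and the network sizes are arbitrary. The point it demonstrates is how the BP weight update for the input layer must use the derivative of the synapse response with respect to the weight, rather than the plain input that a linear synapse would contribute.

```python
import numpy as np

# Hypothetical nonlinear synapse response, standing in for the STT-MTJ NSCS
# characteristic analyzed in the article. tanh(w * x) is used only to
# illustrate a synapse that is nonlinear in both input and weight.
def nscs(w, x):
    return np.tanh(w * x)

def nscs_grad_w(w, x):
    # d/dw tanh(w * x) = x * (1 - tanh(w * x)^2)
    return x * (1.0 - np.tanh(w * x) ** 2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 8, 3
W1 = rng.normal(scale=0.5, size=(n_hidden, n_in))   # input-layer (NSCS) weights
W2 = rng.normal(scale=0.5, size=(n_out, n_hidden))  # conventional linear weights
lr = 0.1

def forward(x):
    s1 = nscs(W1, x[None, :])      # per-synapse nonlinear responses, (n_hidden, n_in)
    h = sigmoid(s1.sum(axis=1))    # hidden activations from summed synapse outputs
    y = sigmoid(W2 @ h)            # conventional linear output layer
    return s1, h, y

def train_step(x, t):
    global W1, W2
    s1, h, y = forward(x)
    # Output layer: standard BP with squared-error loss.
    delta_out = (y - t) * y * (1.0 - y)
    grad_W2 = np.outer(delta_out, h)
    # Input layer: modified BP; the weight gradient uses d(nscs)/dw
    # instead of the input value used for a linear synapse.
    delta_hid = (W2.T @ delta_out) * h * (1.0 - h)
    grad_W1 = delta_hid[:, None] * nscs_grad_w(W1, x[None, :])
    W2 -= lr * grad_W2
    W1 -= lr * grad_W1
    return 0.5 * np.sum((y - t) ** 2)

# Toy usage: fit a single input/target pair to show the loss decreasing.
x = rng.uniform(-1, 1, size=n_in)
t = np.array([1.0, 0.0, 0.0])
for step in range(200):
    loss = train_step(x, t)
print(f"final loss: {loss:.4f}")
```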
