Abstract

With the advent of memristors, analog artificial neural networks are closer than ever, and neural computing is a growing topic of research. In that context, the purpose of this research is to verify that a perceptron can gain a discrete memory by implementing a hysteresis loop in its activation function. The discrete memory is represented by the different paths the hysteresis activation function takes between logic 1 and logic 0. To write to the memory, the input to the hysteresis loop must exceed a threshold; to read the stored value, the input must lie between the two thresholds of the hysteresis function. To verify the perceptron's memory, a network with manually chosen weights is selected which acts as a shift register, and its components are assembled in a circuit simulation program. Functionally, the network receives two inputs: a data signal and an enable signal. The output of the network is a time-shifted version of previous input signals, and a system whose output is a time-shifted version of its previous inputs is considered to have memory.
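The write/read behavior described above can be sketched in software. The following is a minimal illustrative model, not the paper's circuit: the threshold values (0.3 and 0.7) are assumptions chosen only to make the hysteresis band concrete.

```python
class HysteresisActivation:
    """Schmitt-trigger-style activation: the output depends on the input
    AND on the previously stored state, giving the unit a one-bit memory."""

    def __init__(self, v_low=0.3, v_high=0.7):
        # Assumed illustrative thresholds; the paper does not specify values.
        self.v_low = v_low    # falling threshold: inputs at or below write 0
        self.v_high = v_high  # rising threshold: inputs at or above write 1
        self.state = 0        # the stored bit (the discrete memory)

    def __call__(self, x):
        if x >= self.v_high:
            self.state = 1    # write logic 1
        elif x <= self.v_low:
            self.state = 0    # write logic 0
        # Inputs strictly between the thresholds leave the state unchanged,
        # so driving the unit there "reads" the stored value.
        return self.state
```

For example, an input of 0.9 writes a 1; a subsequent input of 0.5 sits inside the hysteresis band and reads back that 1 without disturbing it.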

Highlights

  • Artificial neural networks (ANNs) have been a known data structure in computing since the late 1940s [1]

  • This paper explores the implications of using an activation function with hysteresis in a perceptron

  • The system is a series of perceptrons whose activation function is a hysteresis loop

Summary

Introduction

Artificial neural networks (ANNs) have been a known data structure in computing since the late 1940s [1]. The idea is to take operations that are currently performed virtually and accelerate them by distributing the labor across application-specific perceptron hardware. This would revolutionize neural networks by allowing a network to be scaled larger than ever before. The voltage transfer characteristic of the memristor has the property of Lissajous hysteresis [9]. These properties have been combined in an analog neural network to function as changeable weights using a memristor bridge [12]. The general idea is that there is a memory cell which is overwritten based on a combination of input signals. In this case, a shift register is implemented in analog form, with a Schmitt trigger serving as the activation function.
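The shift-register idea can be sketched as follows. This is an illustrative software model only: the weights, thresholds, and the two-perceptron set/reset gating per stage are assumptions standing in for the paper's manually chosen weights, whose exact values are not given here.

```python
def step(x):
    """Hard-threshold perceptron activation (no hysteresis)."""
    return 1 if x >= 0 else 0

class HysteresisCell:
    """Hysteresis (Schmitt-trigger) activation storing one bit.
    Thresholds are assumed illustrative values."""
    def __init__(self, v_low=0.3, v_high=0.7):
        self.v_low, self.v_high, self.state = v_low, v_high, 0
    def drive(self, x):
        if x >= self.v_high:
            self.state = 1          # write 1
        elif x <= self.v_low:
            self.state = 0          # write 0
        return self.state           # in between: hold (read)

class Stage:
    """One register stage: two ordinary perceptrons steer a hysteresis cell.
    Hand-chosen weights (hypothetical) implement a gated latch."""
    def __init__(self):
        self.cell = HysteresisCell()
    def update(self, data, enable):
        set_  = step(data + enable - 1.5)   # fires when enable AND data
        reset = step(enable - data - 0.5)   # fires when enable AND NOT data
        # 0.5 sits inside the hysteresis band (hold); set/reset push the
        # drive past the upper or lower threshold to overwrite the bit.
        return self.cell.drive(0.5 + 0.5 * set_ - 0.5 * reset)

class ShiftRegister:
    def __init__(self, n):
        self.stages = [Stage() for _ in range(n)]
    def clock(self, data, enable):
        # Capture current outputs first so all stages shift simultaneously,
        # mimicking edge-triggered behavior in this sequential simulation.
        prev = [data] + [s.cell.state for s in self.stages[:-1]]
        for s, d in zip(self.stages, prev):
            s.update(d, enable)
        return [s.cell.state for s in self.stages]
```

Clocking bits in with enable high shifts them down the chain; with enable low, every stage's drive sits inside the hysteresis band, so the register holds its contents regardless of the data line.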

Perceptron
Schmitt Trigger
Schmitt Trigger as a Memory Cell
Making the Network Turing Complete
Conclusion
