Abstract

In 1982, Hopfield proposed a model of neural networks [84] that used two-state threshold "neurons" following a stochastic algorithm. This model explored the ability of a network of highly interconnected "neurons" to exhibit useful collective computational properties, such as content-addressable memory. However, the model is based on McCulloch-Pitts neurons, which differ from real biological neurons and from the realistic behaviour of simple electric circuits: real neurons have continuous input-output relations and integrative time delays due to capacitance. To overcome these limitations, in 1984 Hopfield proposed a continuous-time recurrent neural network model with graded response, described by a set of differential equations. This deterministic system has collective properties very close to those of the earlier stochastic model. Today, this model is well known as the Hopfield model of RNNs, and it has found wide application in optimisation problems [65,22,107,182], associative memories, engineering problems, satellite broadcast scheduling [64,4], graph partitioning [190], stereo vision [145], multiuser detection [101], fault detection and isolation [172], affine-invariant matching [109], pattern sequence recognition [105], classification [26], and more. The contributions of the Hopfield RNN model to the field of neural networks cannot be overemphasised. In fact, it is the outstanding work of Hopfield that rekindled research interest in neural networks among both scientists and engineers.
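For reference, the continuous-time (graded-response) dynamics introduced in Hopfield's 1984 paper are commonly written in the form below. The notation (internal state u_i, output V_i, weights T_ij, capacitance C_i, resistance R_i, bias current I_i, sigmoidal activation g_i) follows the standard textbook presentation and is not taken from the surveyed paper itself.

% Continuous-time Hopfield dynamics: each neuron integrates weighted
% inputs through an RC circuit and passes its state through a sigmoid.
\begin{align}
  C_i \frac{\mathrm{d}u_i}{\mathrm{d}t} &= \sum_{j} T_{ij} V_j - \frac{u_i}{R_i} + I_i, \\
  V_i &= g_i(u_i).
\end{align}
% When T is symmetric and each g_i is monotone increasing, the energy
% function below is non-increasing along trajectories, which underlies
% the model's use in associative memory and optimisation:
\begin{equation}
  E = -\frac{1}{2}\sum_{i,j} T_{ij} V_i V_j
      + \sum_i \frac{1}{R_i}\int_0^{V_i} g_i^{-1}(v)\,\mathrm{d}v
      - \sum_i I_i V_i .
\end{equation}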
