Abstract

This chapter presents studies of a general class of neural network models: a randomly diluted network of McCulloch–Pitts neurons that interact via Hebbian-type connections. It derives and solves exact dynamical equations that describe how the system evolves from its initial state under various conditions. The motivation for these studies is to incorporate neurobiological features into theoretical models and to study their effects on the network's emergent performance as an associative memory. Specifically, it studies interacting neural networks, a variable threshold, hysteresis at the level of a single neuron, and higher-order synaptic interactions in the presence of noise. The models discussed in the chapter simulate a wide variety of phenomena, such as distraction, concentration, selective attention, (de)sensitization, anesthesia, noise resistance, oscillations, chaos, crisis, and multiplicity. The chapter presents a general discussion of noise in neurons and then gives a brief review of the neural network models of Little and Hopfield, together with a version studied by Derrida, Gardner, and Zippelius.
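
To make the abstract's model class concrete, the following is a minimal sketch, not the chapter's actual derivation, of a randomly diluted network of binary (±1) McCulloch–Pitts neurons with Hebbian couplings and noisy parallel updates. All parameter names and values (network size, number of patterns, dilution fraction, noise level) are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200          # number of neurons (assumed)
P = 3            # number of stored patterns (assumed)
dilution = 0.5   # fraction of synapses kept after random dilution (assumed)
beta = 4.0       # inverse "temperature" controlling synaptic noise (assumed)

# Random +/-1 patterns and Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
patterns = rng.choice([-1, 1], size=(P, N))
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

# Random (possibly asymmetric) dilution: keep each synapse with probability `dilution`
mask = rng.random((N, N)) < dilution
J = J * mask

# Start from a corrupted version of pattern 0; update all neurons in parallel
# with Glauber-type noise (Little-style stochastic dynamics)
state = patterns[0] * rng.choice([1, -1], size=N, p=[0.9, 0.1])
for _ in range(20):
    fields = J @ state
    prob_up = 1.0 / (1.0 + np.exp(-2.0 * beta * fields))   # P(s_i = +1 | local field)
    state = np.where(rng.random(N) < prob_up, 1, -1)

overlap = patterns[0] @ state / N   # overlap with the pattern being retrieved
print(f"overlap with stored pattern: {overlap:.2f}")
```

At low noise (large beta) and moderate dilution, the final overlap stays close to 1, illustrating retrieval of a stored pattern; raising the noise or the dilution degrades retrieval, which is the kind of effect the chapter analyzes exactly.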
