Abstract

This chapter considers a class of neural networks that have a recurrent structure, including the Grossberg network, the Hopfield network, and cellular neural networks. The Hopfield network is a form of recurrent artificial neural network invented by John Hopfield in 1982. It consists of a set of neurons and a corresponding set of unit time delays, forming a multiple-loop feedback system. The Grossberg network has three components: Layer 1, Layer 2, and the adaptive weights. Layer 1 is a rough model of the operation of the retina, while Layer 2 represents the visual cortex. Cellular neural networks contain linear and nonlinear circuit elements, typically linear capacitors, linear resistors, linear and nonlinear controlled sources, and independent sources. The chapter also describes the mathematical model of a nonlinear dynamic system and discusses some of the important issues involved in neurodynamics.
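
To make the feedback structure of the Hopfield network concrete, the following is a minimal sketch, not taken from the chapter, assuming the standard discrete bipolar formulation with Hebbian (outer-product) weight storage and asynchronous sign-threshold updates; the function names and example patterns are hypothetical.

```python
import numpy as np

def train_hopfield(patterns):
    """Build a symmetric weight matrix from bipolar (+1/-1) patterns
    via the Hebbian outer-product rule, with zero self-connections."""
    P = np.asarray(patterns, dtype=float)
    n = P.shape[1]
    W = P.T @ P / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, sweeps=20):
    """Asynchronously update each neuron with a sign threshold until the
    state settles, modelling the network's multiple-loop feedback."""
    s = np.array(state, dtype=float)
    for _ in range(sweeps):
        for i in np.random.permutation(len(s)):
            s[i] = 1.0 if W[i] @ s >= 0 else -1.0
    return s

if __name__ == "__main__":
    stored = [[1, -1, 1, -1, 1, -1],
              [1, 1, -1, -1, 1, 1]]
    W = train_hopfield(stored)
    noisy = [1, -1, 1, -1, -1, -1]   # corrupted copy of the first pattern
    print(recall(W, noisy))          # typically converges to the stored pattern
```

Under this formulation the stored patterns act as fixed points of the feedback dynamics, which is the sense in which the Hopfield network functions as an associative memory.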
