A Lyapunov function is constructed for the unsupervised learning equations of a large class of neural networks. These networks have a single layer of adjustable connections; units in the output layer are recurrently connected with fixed symmetric weights. The constructed function is similar in form to the Lyapunov functions derived by Cohen and Grossberg and by Hopfield. Two theorems are proved regarding the location of stable equilibria in the limit of high-gain transfer functions. The analysis is applied to the soft competitive learning networks of Amari and Takeuchi.
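(For orientation, the following recalls the standard Hopfield energy function that the constructed Lyapunov function is said to resemble; it is not the paper's constructed function itself. With fixed symmetric recurrent weights $T_{ij}$, unit outputs $V_i$, transfer function $g$, and external inputs $I_i$, the Hopfield energy reads

\[
E \;=\; -\tfrac{1}{2}\sum_{i,j} T_{ij}\,V_i V_j \;+\; \sum_i \int_0^{V_i} g^{-1}(v)\,dv \;-\; \sum_i I_i V_i ,
\]

which decreases along the network dynamics when $T_{ij}=T_{ji}$; in the high-gain limit the integral term vanishes and minima move toward the corners of the hypercube, the setting of the paper's two equilibrium-location theorems.)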