Stability of the recurrent neural network N,

$$\dot{x}_i = -x_i + \sum_{j=1}^{n} e_{ij} w_{ij}\, g_j(x_j) + I_i, \qquad y_i = g_i(x_i), \qquad i = 1, 2, \ldots, n,$$

with the learning rule L,

$$\dot{e}_{ij} = \mu\, h_{ij}(e, y, y^*), \qquad i, j = 1, 2, \ldots, n,$$

is analyzed using the concept of an equilibrium manifold. We show that this concept provides an ideal setting for formulating a learning rule L that adaptively teaches network N to acquire a desired pattern $y^*$ as one of its asymptotically stable equilibria. Connective stability of a moving equilibrium of the composite two-time-scale system N & L is established for bounded interconnection weights $e_{ij} w_{ij}$ and a sufficiently small learning rate $\mu$. The conditions for connective stability, which are derived using M-matrices and the concept of vector Liapunov functions, are suitable for studying the design trade-offs among the bounds on the nominal weights $w_{ij}$, the shape of the sigmoid functions $g_i(x_i)$, the learning rule $h(e, y, y^*)$, the rate $\mu$, and the size of the stability region containing $y^*$.
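As a concrete illustration of the composite system N & L, the following Python sketch integrates both equations by forward Euler. It is not the paper's construction: the abstract leaves $h_{ij}$, $g_i$, and all parameters generic, so the gradient-type choice $h_{ij} = (y^*_i - y_i)\,y_j$, the $\tanh$ sigmoid, and the numerical values below are hypothetical placeholders.

```python
import numpy as np

# Minimal sketch (assumptions noted above, not the paper's method):
# Euler simulation of network N with learning rule L.

n = 3
rng = np.random.default_rng(0)

w = rng.normal(size=(n, n))          # nominal weights w_ij (fixed)
e = np.ones((n, n))                  # adaptive multipliers e_ij
x = np.zeros(n)                      # neuron states x_i
I = np.zeros(n)                      # external inputs I_i
y_star = np.array([0.5, -0.3, 0.2])  # desired output pattern y*
mu = 0.01                            # small learning rate (slow time scale)
dt = 0.01                            # Euler step


def g(x):
    return np.tanh(x)                # sigmoid activation g_i (assumed)


for _ in range(50_000):
    y = g(x)
    # Network N: x_dot_i = -x_i + sum_j e_ij w_ij g_j(x_j) + I_i
    x_dot = -x + (e * w) @ y + I
    # Learning rule L with a hypothetical Hebbian-type h_ij that
    # drives the output y toward the target pattern y*
    e_dot = mu * np.outer(y_star - y, y)
    x += dt * x_dot
    e += dt * e_dot

print("final output y:", g(x))
print("target      y*:", y_star)
```

The small $\mu$ makes the weights $e_{ij}$ evolve slowly relative to the states $x_i$, reflecting the two-time-scale structure under which the abstract's connective-stability conditions are derived; convergence of this particular sketch is illustrative only.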