Abstract

Stability plays an important role in both control theory and system identification. Furthermore, stability is of crucial importance for training algorithms that adjust the parameters of neural networks. If the predictor is unstable for certain choices of neural model parameters, serious numerical problems can occur during training. Stability criteria should be universal, applicable to as broad a class of systems as possible, and at the same time computationally efficient. The majority of well-known approaches are based on Lyapunov’s method [163, 77, 164, 165, 166, 167]. Fang and Kincaid applied the matrix measure technique to study global exponential stability of asymmetrical Hopfield-type networks [168]. Jin et al. [169] derived sufficient conditions for absolute stability of a general class of discrete-time recurrent networks by using Ostrowski’s theorem. Recently, global asymptotic as well as exponential stability conditions for discrete-time recurrent networks with globally Lipschitz continuous and monotone nondecreasing activation functions were introduced by Hu and Wang [170]. The existence and uniqueness of an equilibrium were given as a matrix determinant problem. Unfortunately, most of the existing results do not consider the stabilization of the network during training; they allow checking the stability of the neural model only after it has been trained. Literature on the stabilization of neural networks during training is rather scarce. Jin and Gupta proposed two training methods for a discrete-time dynamic network: the multiplier and constrained learning rate algorithms. Both algorithms utilize stability conditions derived by using Lyapunov’s first method and Gershgorin’s theorem. In turn, Suykens et al. [172] derived stability conditions for recurrent multi-layer networks using linearisation, robustness analysis of linear systems under non-linear perturbations, and matrix inequalities. These conditions were then used to constrain the dynamic backpropagation algorithm. These solutions, however, are devoted to globally recurrent networks.
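To make the kind of criterion discussed above concrete, the sketch below applies a Gershgorin-disc bound to the recurrent weight matrix of an assumed discrete-time model x[k+1] = W·φ(x[k]) with an activation of known Lipschitz constant. This is an illustrative sufficient check only, not the specific algorithm of any of the cited works; the model form, the function name, and the lipschitz_bound parameter are assumptions introduced for the example.

```python
import numpy as np

def gershgorin_stability_check(W, lipschitz_bound=1.0):
    """Illustrative sufficient condition for an assumed model x[k+1] = W @ phi(x[k]).

    If every Gershgorin disc of the scaled matrix L*W lies strictly inside the
    unit circle, all its eigenvalues do too, which is sufficient for the
    linearised dynamics to be stable. `lipschitz_bound` (L) is the assumed
    Lipschitz constant of the activation phi (e.g. 1 for tanh).
    """
    A = lipschitz_bound * np.asarray(W, dtype=float)
    for i in range(A.shape[0]):
        centre = abs(A[i, i])                          # disc centre magnitude
        radius = np.sum(np.abs(A[i, :])) - centre      # sum of off-diagonal row entries
        if centre + radius >= 1.0:                     # disc reaches the unit circle
            return False                               # sufficient condition not met
    return True                                        # all discs strictly inside => stable

# Example: a small recurrent weight matrix whose absolute row sums stay below 1
W = np.array([[0.3, -0.2, 0.1],
              [0.1,  0.4, 0.2],
              [0.0,  0.1, 0.5]])
print(gershgorin_stability_check(W))  # True
```

Such a row-sum test is conservative but very cheap to evaluate, which is why Gershgorin-type bounds are attractive as constraints enforced during training rather than only as a post-training stability check.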
