Abstract
We study the dynamics of a symmetric analog neural network with a parallel update rule that averages over M previous time steps. We show that convergence to a fixed-point attractor can be guaranteed by a simple criterion that limits the neuron gain (the maximum slope of the neuron transfer function) to a value proportional to M. A global stability analysis based on a new Liapunov function is presented. The analysis generalizes a previous result for M=1, i.e., standard parallel updating. Multistep updating allows oscillation-free parallel dynamics for networks that have period-2 limit cycles under standard parallel updating. Results are applied to associative memory networks based on the Hebb and pseudoinverse learning rules. In addition, we present a simple analysis of convergence times showing that the number of iterations required for a multistep neural network to converge to a fixed point increases in proportion to M when all other network parameters are held fixed. However, because increasing M allows the gain to be raised without inducing oscillation, in some instances a larger M can yield a shorter convergence time when the neuron gain is also optimally adjusted.
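As a rough illustration of the multistep idea, the sketch below simulates a symmetric network whose parallel update is driven by the average of the M most recent states. Only the notion of averaging over M previous steps comes from the abstract; the tanh transfer function with gain parameter, the Hebb-style weight matrix, and the convergence test are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Minimal sketch (assumed form, not the paper's code): each neuron's input is
# computed from the average of the M most recent network states, and all
# neurons are updated in parallel through a tanh transfer function with a
# tunable gain.

rng = np.random.default_rng(0)
N, M, gain = 100, 4, 2.0

# Symmetric weights with zero diagonal (Hebb rule on random patterns, for illustration)
patterns = rng.choice([-1.0, 1.0], size=(5, N))
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

# History of the M most recent states
history = [rng.uniform(-0.1, 0.1, N) for _ in range(M)]

for t in range(500):
    x_avg = np.mean(history, axis=0)        # average over the M previous states
    x_new = np.tanh(gain * (W @ x_avg))     # parallel update of all neurons
    if np.max(np.abs(x_new - history[-1])) < 1e-8:
        print(f"converged after {t} iterations")
        break
    history = history[1:] + [x_new]         # slide the M-step window forward
```

In this toy setting, lowering the gain (or raising M) suppresses the period-2 oscillations that can appear under standard single-step parallel updating, consistent with the gain criterion described in the abstract.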