Abstract

This paper investigates a new learning algorithm (LF I), based on a Lyapunov function, for the training of feedforward neural networks. The proposed algorithm has an interesting parallel with the popular back-propagation algorithm: the fixed learning rate of back-propagation is replaced by an adaptive learning rate computed using a convergence theorem based on Lyapunov stability theory. Next, the proposed algorithm is modified (LF II) to allow a smooth search of the weight space. The performance of the proposed algorithms is compared with the back-propagation algorithm and extended Kalman filtering (EKF) on two benchmark function-approximation problems, XOR and 3-bit parity. The comparisons are made in terms of the learning iterations and computational time required for convergence. It is found that the proposed algorithms (LF I and II) converge faster than the other two algorithms for the same accuracy. Finally, a comparison is made on a system identification problem, where it is shown that the proposed algorithms can achieve better function-approximation accuracy.
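To illustrate the general idea of replacing back-propagation's fixed learning rate with a Lyapunov-based adaptive one, the following minimal sketch trains a small network on XOR while rescaling the step size each iteration so that a candidate Lyapunov function (the squared output error) keeps decreasing. The specific network size, the rate formula eta = mu * V / ||grad V||^2, and the constants mu and eps are illustrative assumptions, not the paper's exact LF I or LF II update rules.

```python
# Hedged sketch: gradient descent on XOR with a Lyapunov-style adaptive learning
# rate. V = 0.5 * sum(e^2) serves as the candidate Lyapunov function; eta is
# renormalized every iteration, unlike back-propagation's fixed eta.
# (Network size, mu, and eps are illustrative choices, not the paper's values.)
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)
mu, eps = 0.5, 1e-12  # assumed constants for the adaptive rate

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for it in range(5000):
    # Forward pass
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    e = T - Y
    V = 0.5 * np.sum(e ** 2)  # candidate Lyapunov function (squared error)

    # Back-propagated gradients of V with respect to the weights
    dY = -e * Y * (1 - Y)
    gW2 = H.T @ dY; gb2 = dY.sum(0)
    dH = (dY @ W2.T) * H * (1 - H)
    gW1 = X.T @ dH; gb1 = dH.sum(0)

    # Adaptive learning rate: scale the step so V shrinks at each iteration
    gnorm2 = sum(np.sum(g ** 2) for g in (gW1, gb1, gW2, gb2))
    eta = mu * V / (gnorm2 + eps)

    W1 -= eta * gW1; b1 -= eta * gb1
    W2 -= eta * gW2; b2 -= eta * gb2

print("final V:", V)
```

With the gradient-normalized rate, each step reduces V by roughly a factor of (1 - mu) to first order, which is the kind of guaranteed-descent behavior a Lyapunov convergence argument aims for; back-propagation with a fixed rate offers no such per-step guarantee.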
