Abstract

Owing to their parallel-distributed nature and their suitability for physical implementation on dedicated hardware, recurrent neural networks have found broad applications in many fields. In this paper, a special class of recurrent neural network, named the Zhang neural network (ZNN), together with its electronic realization, is investigated and exploited for the online solution of time-varying linear matrix equations. Following the idea of the Zhang function (i.e., error function), two ZNN models are proposed and studied, which admit a wide variety of activation functions (e.g., any monotonically increasing odd activation function). It is theoretically proved that both ZNN models converge globally and exponentially to the theoretical solution of time-varying linear matrix equations when linear activation functions are used. In addition, a new activation function, named the Li activation function, is exploited. It is theoretically proved that, with the Li activation function, the two ZNN models achieve finite-time convergence to the time-varying theoretical solution. Moreover, an upper bound on the convergence time is derived analytically via Lyapunov theory. Extensive simulations using the two ZNN models are then conducted. The results substantiate the theoretical analysis and the efficacy of the proposed ZNN models for solving time-varying linear matrix equations.
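To make the design procedure concrete, the sketch below is a minimal illustration (not the paper's implementation) of the standard ZNN design formula Ė(t) = −γΦ(E(t)) applied to a hypothetical time-varying equation A(t)X(t) = B(t), using the Zhang function E(t) = A(t)X(t) − B(t) and a linear activation function. The coefficient matrices, step size, and design parameter γ are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical time-varying coefficients A(t), B(t) for the matrix
# equation A(t) X(t) = B(t); these are illustrative choices only.
# A(t) is constructed to be invertible for all t.
def A(t):
    return np.array([[2 + np.sin(t), np.cos(t)],
                     [-np.cos(t),    2 + np.sin(t)]])

def B(t):
    return np.array([[np.sin(t)],
                     [np.cos(t)]])

def A_dot(t, h=1e-6):
    # Numerical time derivative of A(t) (forward difference).
    return (A(t + h) - A(t)) / h

def B_dot(t, h=1e-6):
    return (B(t + h) - B(t)) / h

gamma = 10.0        # convergence-rate design parameter (assumed value)
phi = lambda E: E   # linear activation function, applied element-wise

# Euler integration of the ZNN dynamics
#   A(t) X_dot = B_dot(t) - A_dot(t) X - gamma * phi(A(t) X - B(t)),
# obtained by substituting E(t) = A(t) X(t) - B(t) into the design
# formula E_dot = -gamma * phi(E).
dt, T = 1e-3, 5.0
X = np.zeros((2, 1))  # arbitrary initial state
for k in range(int(T / dt)):
    t = k * dt
    E = A(t) @ X - B(t)
    X_dot = np.linalg.solve(A(t), B_dot(t) - A_dot(t) @ X - gamma * phi(E))
    X += dt * X_dot

# With a linear activation, the residual decays exponentially toward zero.
print("residual ||A X - B|| at t = T:", np.linalg.norm(A(T) @ X - B(T)))
```

A finite-time-convergent variant would replace the linear phi with a suitable nonlinear activation such as the paper's Li activation function; the linear case shown here corresponds to the globally, exponentially convergent setting analyzed in the abstract.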
