Abstract

In this paper, a discrete-time Zhang neural network (DTZNN) model, discretized from the continuous-time Zhang neural network, is proposed and investigated for performing online future minimization (OFM). To approximate the first-order derivative more accurately in computation and discretize the continuous-time Zhang neural network more effectively, a new Taylor-type numerical differentiation formula, together with an optimal sampling-gap rule, is presented and utilized to obtain the Taylor-type DTZNN model. For comparison, the Euler-type DTZNN model and Newton iteration are also presented, with an interesting link between them being found. Moreover, theoretical results on stability and convergence are presented, which show that the steady-state residual errors of the presented Taylor-type DTZNN model, the Euler-type DTZNN model and Newton iteration follow patterns of O(τ³), O(τ²) and O(τ), respectively, with τ denoting the sampling gap. Numerical experimental results further substantiate the effectiveness and advantages of the Taylor-type DTZNN model for solving the OFM problem.
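The abstract's error orders hinge on the accuracy of the underlying derivative-approximation formula: an O(τ) difference rule yields the Euler-type discretization, while an O(τ²) Taylor-type rule yields the higher-accuracy model. The paper's exact formula is not reproduced here; the sketch below instead uses a known three-instant Taylor-type backward-data formula with O(τ²) truncation error, purely to illustrate how the accuracy gap versus the Euler forward difference shows up numerically when the sampling gap τ is halved.

```python
import math

def euler_diff(f, t, tau):
    # Euler forward difference: truncation error O(tau)
    return (f(t + tau) - f(t)) / tau

def taylor_diff(f, t, tau):
    # A Taylor-type three-instant formula with O(tau^2) truncation error
    # (an illustrative stand-in; the formula in the paper may differ):
    # f'(t) ~= (2f(t+tau) - 3f(t) + 2f(t-tau) - f(t-2tau)) / (2tau)
    return (2*f(t + tau) - 3*f(t) + 2*f(t - tau) - f(t - 2*tau)) / (2*tau)

# Test on f = sin with known derivative cos; halving tau should roughly
# halve the Euler error (order 1) and quarter the Taylor-type error (order 2).
t, exact = 1.0, math.cos(1.0)
for tau in (0.1, 0.05, 0.025):
    e_err = abs(euler_diff(math.sin, t, tau) - exact)
    t_err = abs(taylor_diff(math.sin, t, tau) - exact)
    print(f"tau={tau:.3f}  Euler err={e_err:.2e}  Taylor-type err={t_err:.2e}")
```

The observed error ratios between successive τ values (about 2 for Euler, about 4 for the Taylor-type rule) mirror the one-order accuracy gain that, in the paper's setting, lifts the steady-state residual error from O(τ²) to O(τ³).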

