This paper investigates convergence theorems for a Discrete Hopfield Neural Network (DHNN) with delay. We present two updating rules, one for serial mode and one for parallel mode, which converge faster than existing updating rules. We prove that a DHNN with delay converges to a stable state in serial mode if the weight matrix of the no-delay term is symmetric, and in parallel mode if that matrix is symmetric and non-negative definite. The convergence condition for a DHNN without delay can be relaxed from symmetry of the weight matrix to the weaker condition of quasi-symmetry. These results extend both the existing convergence results for a DHNN without delay and our previous findings. Using the new network structure and its convergence theorems, we propose a local search algorithm for combinatorial optimization. We also relate the maximum value of a bivariate energy function, which generalizes Hopfield's energy function, to the stable states of a DHNN with delay. Moreover, for the serial mode we give the relationship between the convergence of the energy function and the convergence of the corresponding network. An application is presented to demonstrate the higher rate of convergence and the classification accuracy of our algorithm.
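For concreteness, the following is a minimal sketch of the model the abstract refers to, under the standard formulation of a delayed DHNN with bipolar states; the weight matrices W^0 (no-delay term) and W^1 (delay term) and the threshold vector θ are notation assumed here, since the abstract does not spell it out:

```latex
% Assumed dynamics of a DHNN with delay: W^{0} weights the current state
% (the "no-delay term"), W^{1} weights the state one step back.
x_i(t+1) \;=\; \operatorname{sgn}\!\Bigl(\sum_{j=1}^{n} w^{0}_{ij}\, x_j(t)
            \;+\; \sum_{j=1}^{n} w^{1}_{ij}\, x_j(t-1) \;+\; \theta_i\Bigr),
\qquad x_i(t) \in \{-1, +1\}.
```

In serial mode one neuron updates at a time; in parallel mode all neurons update simultaneously. Setting W^1 = 0 recovers the classical DHNN, whose stable states maximize Hopfield's energy E(x) = ½ xᵀW⁰x + θᵀx; the paper's bivariate energy E(x(t), x(t-1)) generalizes this to the delayed case, though its exact form is given in the full paper rather than the abstract.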
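The serial-mode rule above also admits a short executable sketch. The Python code below is our illustration only, not the paper's algorithm: the function name serial_dhnn_with_delay, the tie-breaking convention sgn(0) = +1, and the stopping test are assumptions we introduce, and the symmetric W0 in the usage example mirrors the serial-mode convergence condition stated in the abstract.

```python
import numpy as np

def serial_dhnn_with_delay(W0, W1, theta, x_prev, x_curr, max_sweeps=100):
    """Asynchronous (serial-mode) updating of a delayed DHNN.

    Assumed dynamics (see the equation above): each single-neuron update
    is one time step, the delayed term sees the configuration one step
    back, and sgn(0) is treated as +1 (an assumed tie-breaking rule).
    """
    x_prev, x_curr = x_prev.copy(), x_curr.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(theta)):
            # local field: no-delay term on x(t), delay term on x(t-1)
            h = W0[i] @ x_curr + W1[i] @ x_prev + theta[i]
            new_state = 1 if h >= 0 else -1
            if new_state != x_curr[i]:
                changed = True
            x_prev = x_curr.copy()   # x(t) becomes x(t-1) for the next step
            x_curr[i] = new_state
        if not changed:              # full sweep with no flips: stable state
            return x_curr
    return x_curr                    # sweep budget exhausted

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 6
    A = rng.standard_normal((n, n))
    W0 = (A + A.T) / 2                      # symmetric no-delay weights
    W1 = 0.1 * rng.standard_normal((n, n))  # small delay weights
    theta = np.zeros(n)
    x_tm1 = rng.choice([-1, 1], size=n)     # x(t-1)
    x_t = rng.choice([-1, 1], size=n)       # x(t)
    print(serial_dhnn_with_delay(W0, W1, theta, x_tm1, x_t))
```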