Abstract

Many combinatorial optimization problems arising in science and technology are difficult to solve exactly, so a great number of approximate algorithms have been developed (Reeves, 1993; Zanakis et al., 1989). Hopfield and Tank (1985) applied the continuous-time, continuous-output Hopfield neural network (CTCO-HNN) to the traveling salesman problem (TSP), initiating a new approach to optimization problems. However, the Hopfield neural network is often trapped in local minima because of its gradient-descent dynamics, and a number of modifications have been proposed to help it escape. Incorporating chaos into the Hopfield neural network has so far proved to be a successful way of improving the convergence properties of HNNs. In this paper, we first review three chaotic neural network models and then propose a novel approach to chaotic simulated annealing. We apply all four models to a 10-city TSP and present the time evolution of the energy function and of the neuron outputs for each model. The features and effectiveness of the four methods are discussed and evaluated on the basis of the simulation results. We conclude that the proposed neural network with simulated annealing obtains global minima more reliably than the other chaotic neural network models when applied to difficult combinatorial optimization problems.
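The idea of chaotic simulated annealing can be illustrated with a single transiently chaotic neuron in the style of Chen and Aihara: a decaying self-feedback term injects chaotic search early on, and as it anneals away the dynamics settle toward ordinary gradient descent. The sketch below is a minimal illustration under assumed parameter values (`k`, `alpha`, `beta`, `z0`, `i0`, `eps` are hypothetical choices, not taken from this paper):

```python
import math

def simulate(steps=300, k=0.9, alpha=0.015, beta=0.003,
             z0=0.08, i0=0.65, eps=1.0 / 250.0):
    """One transiently chaotic neuron (illustrative parameters).

    y : internal state, z : self-feedback (chaos) strength.
    The self-feedback z decays each step, annealing chaos away.
    """
    y, z = 0.5, z0
    xs, zs = [], []
    for _ in range(steps):
        x = 1.0 / (1.0 + math.exp(-y / eps))    # sigmoid output in (0, 1)
        # Internal dynamics: damping + (zero external input here)
        # minus the chaotic self-feedback term z * (x - i0).
        y = k * y + alpha * 0.0 - z * (x - i0)
        z = (1.0 - beta) * z                    # anneal the chaos term
        xs.append(x)
        zs.append(z)
    return xs, zs

outputs, feedback = simulate()
```

Early iterations wander chaotically; once `z` has decayed, the neuron relaxes like a conventional Hopfield unit, which is the mechanism the abstract credits with escaping local minima.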
