Abstract

Gradient descent is one of the most common methods for updating the parameters of a neural network; however, it may become trapped in a local optimum during training. The genetic algorithm, by contrast, can locate the global optimum through global search, but it is often inefficient and can require considerable running time. The two methods are complementary in running time and cost, which motivated us to combine gradient descent with the genetic algorithm. In this paper, a multi-start combinatorial optimization method based on the genetic algorithm and gradient descent is proposed. First, initial points are selected by the genetic algorithm. Then multi-start gradient descent is applied from these points, which quickly achieves a global search and improves performance at relatively low running time and cost. Compared with the traditional genetic algorithm and with plain gradient descent, the proposed algorithm computes the global optimum more efficiently.
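The two-stage scheme the abstract describes can be sketched in a few lines: a genetic algorithm coarsely explores the search space and hands its best individuals to gradient descent as starting points. The sketch below is illustrative only, not the paper's implementation; the objective (a 1-D Rastrigin-style function), the GA operators (tournament selection, blend crossover, Gaussian mutation), and all hyperparameters are assumptions chosen for the example.

```python
import math
import random

random.seed(0)  # for reproducibility of this illustration

# Hypothetical multimodal objective (1-D Rastrigin); global minimum at x = 0.
def f(x):
    return x * x - 10 * math.cos(2 * math.pi * x) + 10

def grad_f(x, h=1e-6):
    # Central-difference numerical gradient.
    return (f(x + h) - f(x - h)) / (2 * h)

def ga_initial_points(pop_size=20, generations=30, bounds=(-5.0, 5.0), n_starts=3):
    """Stage 1: a simple genetic algorithm returns promising starting points."""
    pop = [random.uniform(*bounds) for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            p1 = min(random.sample(pop, 3), key=f)  # tournament selection
            p2 = min(random.sample(pop, 3), key=f)
            child = 0.5 * (p1 + p2)                 # blend crossover
            child += random.gauss(0, 0.3)           # Gaussian mutation
            child = max(bounds[0], min(bounds[1], child))
            new_pop.append(child)
        pop = new_pop
    return sorted(pop, key=f)[:n_starts]            # best individuals as starts

def gradient_descent(x0, lr=0.002, steps=1000):
    """Stage 2: local refinement by gradient descent from one starting point."""
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

# Multi-start: refine every GA-selected point, keep the best result.
starts = ga_initial_points()
candidates = [gradient_descent(x0) for x0 in starts]
best = min(candidates, key=f)
print(best, f(best))
```

Because the GA already concentrates the starting points in low-cost basins, only a handful of gradient-descent runs are needed, which is where the running-time saving over a pure global search comes from.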
