Abstract

Gradient-based Adaptive Stochastic Search (GASS) is a recently proposed stochastic search algorithm for optimization. It iteratively searches for promising candidate solutions through a population of samples generated from a parameterized probabilistic model on the solution space, and updates the parameter of the probabilistic model via a direct gradient method. Under the framework of GASS, we propose two discrete optimization algorithms: discrete Gradient-based Adaptive Stochastic Search (discrete-GASS) and annealing Gradient-based Adaptive Stochastic Search (annealing-GASS). In discrete-GASS, we transform the discrete optimization problem into a continuous optimization problem on the parameter space of a family of independent discrete distributions, and apply a gradient-based method to find the optimal parameter, such that the corresponding distribution has the best capability to generate optimal solution(s) to the original discrete problem. In annealing-GASS, we use a Boltzmann distribution as the parameterized probabilistic model, and propose a gradient-based temperature schedule that adapts to the current performance of the algorithm. We show convergence of both discrete-GASS and annealing-GASS under appropriate conditions. Numerical results on several benchmark optimization problems and the traveling salesman problem indicate that both algorithms perform competitively against a number of other algorithms, including model reference adaptive search, the cross-entropy method, and multi-start simulated annealing with different temperature schedules.
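The model-based search loop described above (sample from a parameterized distribution, evaluate the samples, move the parameter by a gradient step so the distribution concentrates on good solutions) can be sketched as follows. This is only an illustrative approximation of the discrete-GASS idea, not the authors' exact update: it uses a product of independent categorical distributions per variable, elite-indicator weights in the style of cross-entropy methods, and a plain score-function gradient step on the natural parameters. All function and parameter names here (`discrete_search`, `lr`, `elite_frac`, etc.) are assumptions for the sketch.

```python
import numpy as np

def discrete_search(objective, n_vars, n_vals, iters=150, pop=200,
                    lr=0.5, elite_frac=0.1, seed=0):
    """Illustrative model-based discrete search (discrete-GASS-flavored).

    Maximizes `objective` over vectors x in {0, ..., n_vals-1}^n_vars by
    gradient ascent on the natural parameters of independent categoricals.
    """
    rng = np.random.default_rng(seed)
    theta = np.zeros((n_vars, n_vals))   # natural parameters, one row per variable
    best_x, best_f = None, -np.inf
    for _ in range(iters):
        # Current sampling distribution: softmax of theta, row by row.
        p = np.exp(theta - theta.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        # Sample a population from the product of categoricals.
        samples = np.stack(
            [rng.choice(n_vals, size=pop, p=p[i]) for i in range(n_vars)], axis=1)
        f = np.array([objective(x) for x in samples])
        i_best = int(f.argmax())
        if f[i_best] > best_f:
            best_f, best_x = float(f[i_best]), samples[i_best].copy()
        # Elite-indicator weights: samples at or above the elite quantile.
        w = (f >= np.quantile(f, 1.0 - elite_frac)).astype(float)
        w /= w.sum()
        # Score-function gradient: sum_j w_j * (one_hot(x_j) - p);
        # since the weights sum to 1, the second term is just p.
        grad = np.zeros_like(theta)
        for x, wj in zip(samples, w):
            grad[np.arange(n_vars), x] += wj
        grad -= p
        theta += lr * grad
    return best_x, best_f
```

On a small separable test problem (e.g. counting matches against a fixed target vector), a few hundred iterations of this loop concentrate the per-variable distributions on the optimal values.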

