Abstract
We examine the convergence properties of genetic algorithms (GAs) in a wide variety of noisy environments where fitness perturbation can occur in any form; for example, fitness functions can be perturbed concurrently by additive and multiplicative noise. We reveal the convergence properties of such GAs by constructing and analyzing a Markov chain that explicitly models the evolution of the algorithms. We compute the one-step transition probabilities of the chain and show that the chain has only one positive recurrent communicating class. Based on this property, we establish a condition that is necessary and sufficient for GAs to eventually find a globally optimal solution with probability 1. We also identify a condition that is necessary and sufficient for GAs to fail, with probability 1, to ever find a globally optimal solution. Our analysis further shows that in all such noisy environments the chain converges to stationarity: it has a unique stationary distribution that is also its steady-state distribution. We describe how this property and the one-step transition probabilities of the chain can be used to compute the exact probability that a GA is guaranteed to select a globally optimal solution upon completion of each iteration.
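The computation described in the last sentence can be illustrated with a minimal sketch (not taken from the paper): given a hypothetical one-step transition matrix P over a small, enumerated set of population states, one can propagate an initial distribution to obtain the per-iteration probability of holding a globally optimal solution, and extract the stationary distribution as the limiting value. The 3-state matrix, the set of optimal states, and the initial distribution below are all assumed for illustration.

```python
import numpy as np

# Hypothetical 3-state example: state 2 is the only population state that
# contains a globally optimal solution. Each row of P must sum to 1.
P = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.2, 0.7],
])
optimal_states = [2]              # indices of states holding a global optimum
pi0 = np.array([1.0, 0.0, 0.0])   # assumed initial distribution over states

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
eigvals, eigvecs = np.linalg.eig(P.T)
stationary = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
stationary = stationary / stationary.sum()

# Probability of holding a global optimum after each iteration t, obtained by
# propagating the state distribution with the one-step transition matrix.
dist = pi0
for t in range(1, 11):
    dist = dist @ P
    p_opt = dist[optimal_states].sum()
    print(f"iteration {t}: P(population contains a global optimum) = {p_opt:.4f}")

print("limiting value from the stationary distribution:",
      stationary[optimal_states].sum())
```

In this toy setting the per-iteration probabilities approach the mass that the stationary distribution places on the optimal states, mirroring the convergence-to-stationarity property stated in the abstract.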