Abstract
We analyze the transition and convergence properties of genetic algorithms (GAs) applied to fitness functions perturbed concurrently by additive and multiplicative noise. Both the additive and the multiplicative noise are assumed to take finitely many values. We explicitly construct a Markov chain that models the evolution of GAs in this noisy environment and analyze it to investigate the algorithms. Our analysis shows that this Markov chain is indecomposable; it has only one positive recurrent communication class. Using this property, we establish a condition that is both necessary and sufficient for GAs to eventually (i.e., as the number of iterations goes to infinity) find a globally optimal solution with probability 1. Similarly, we identify a condition that is both necessary and sufficient for the algorithms to eventually, with probability 1, fail to find any globally optimal solution. Our analysis also shows that the chain has a stationary distribution that is also its steady-state distribution. Based on this property and the transition probabilities of the chain, we compute the exact probability that a GA selects a globally optimal solution upon completion of each iteration.
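To make the noisy-fitness setting concrete, the following is a minimal sketch of a GA whose selection step sees only fitness values perturbed concurrently by multiplicative and additive noise, each drawn from a finite support. The OneMax objective, the specific noise values, and the truncation-selection, uniform-crossover, and bit-flip-mutation operators are illustrative assumptions, not the construction analyzed in the paper.

```python
import random

# Hypothetical objective; the paper does not fix a particular fitness function.
def true_fitness(x):
    return sum(x)  # OneMax: number of 1-bits

# Finite noise supports, matching the abstract's assumption that both noises
# take finitely many values (the specific values here are illustrative).
MULT_VALUES = [0.8, 1.0, 1.2]
ADD_VALUES = [-1.0, 0.0, 1.0]

def noisy_fitness(x):
    """Observed fitness: multiplicative and additive perturbations applied together."""
    m = random.choice(MULT_VALUES)
    a = random.choice(ADD_VALUES)
    return m * true_fitness(x) + a

def genetic_algorithm(n_bits=10, pop_size=20, p_mut=0.05, iterations=200):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(iterations):
        # Selection is based only on the noisy observations.
        scored = sorted(pop, key=noisy_fitness, reverse=True)
        parents = scored[: pop_size // 2]
        # Uniform crossover and bit-flip mutation produce the next population.
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]
            child = [1 - bit if random.random() < p_mut else bit for bit in child]
            children.append(child)
        pop = children
    return max(pop, key=true_fitness)

if __name__ == "__main__":
    best = genetic_algorithm()
    print(best, true_fitness(best))
```

Because the population at each iteration depends only on the previous population and the (finitely valued) noise, the population sequence of such a GA can be modeled as a Markov chain, which is the object the paper constructs and analyzes.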