Abstract
Recent progress in the runtime analysis of evolutionary algorithms (EAs) has allowed the derivation of upper bounds on the expected runtime of standard steady-state genetic algorithms (GAs). These upper bounds have shown speed-ups of GAs using crossover and mutation over the same algorithms using only mutation operators (i.e., steady-state EAs), both for standard unimodal (i.e., OneMax) and multimodal (i.e., Jump) benchmark functions. The bounds suggest that populations are beneficial to the GA, as are mutation rates higher than the default 1/n rate. However, rigorous claims were not possible because matching lower bounds were not available. Proving lower bounds on crossover-based EAs is a notoriously difficult task, as it is hard to capture the progress that a diverse population can make. We use a potential function approach to prove a tight lower bound on the expected runtime of the (2+1) GA for OneMax for all mutation rates c/n with c < 1.422. This provides the last piece of the puzzle completing the proof that larger population sizes improve the performance of the standard steady-state GA for OneMax for various mutation rates, and it proves that the optimal mutation rate for the (2+1) GA on OneMax is (√97 − 5)/(4n) ≈ 1.2122/n.
Highlights
The runtime analysis of randomized search heuristics like evolutionary algorithms (EAs), simulated annealing, ant colony optimization and estimation-of-distribution algorithms is a young and active subfield in algorithm research that has produced remarkable results in the last 20 years [2,11,15,23]
With c = (√97 − 5)/4 ≈ 1.21221445, we identify c/n as the optimal mutation rate for the (2+1) genetic algorithm (GA) within the range of rates covered by Theorem 2
Proving lower bounds for crossover-based GAs is a notoriously hard problem. We have provided such a lower bound for the (2+1) GA on OneMax through a careful analysis of a potential function that captures both the current best fitness and the potential for finding improvements through crossover combining different “building blocks” of good solutions
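The optimal constant stated in the highlights can be checked numerically. This is a quick sanity check of the arithmetic only, not part of the proof:

```python
import math

# Optimal mutation-rate constant from Theorem 2: c = (sqrt(97) - 5) / 4,
# giving the optimal mutation rate c/n for the (2+1) GA on OneMax.
c = (math.sqrt(97) - 5) / 4
print(round(c, 8))  # → 1.21221445
```

The value also lies below the bound c < 1.422 required by the lower-bound theorem, so the optimum falls inside the range the analysis covers.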
Summary
The runtime analysis of randomized search heuristics like evolutionary algorithms (EAs), simulated annealing, ant colony optimization and estimation-of-distribution algorithms is a young and active subfield in algorithm research that has produced remarkable results in the last 20 years [2,11,15,23]. Dang et al. [7] proved that for sufficiently large population sizes, the (μ+1) GA is at least a linear factor faster on the Jump benchmark function than the best algorithm using only standard bit mutation. They showed that crossover may help algorithms escape more quickly from local optima. A major difficulty in proving rigorous lower bounds for populations with crossover is finding a way to aggregate the state of the algorithm so that it accurately captures both the current distance from the optimum and the potential improvements available to the crossover operator. These improvements can be large if the parents have a large Hamming distance. Once the potential is appropriately bounded, standard drift analysis arguments yield a lower bound on the expected time.
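The algorithm under analysis can be sketched as follows. This is a minimal, illustrative implementation assuming uniform crossover applied before standard bit mutation and elitist replacement of a worst individual; the exact selection and tie-breaking conventions of the analyzed (2+1) GA may differ. All function and parameter names (`onemax`, `two_plus_one_ga`, `seed`, `max_gens`) are ours, not the paper's:

```python
import random

def onemax(x):
    """Number of 1-bits; the optimum of OneMax is the all-ones string."""
    return sum(x)

def two_plus_one_ga(n, c, seed=0, max_gens=200_000):
    """One run of a (2+1) GA on OneMax with mutation rate c/n.

    Returns the generation in which the optimum is first sampled
    (or max_gens if it is never reached within the budget).
    Illustrative sketch only; conventions vary across formalizations.
    """
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(2)]
    for gen in range(1, max_gens + 1):
        # Select two parents uniformly at random, with replacement.
        p1, p2 = rng.choice(pop), rng.choice(pop)
        # Uniform crossover: each bit comes from a random parent.
        child = [rng.choice(bits) for bits in zip(p1, p2)]
        # Standard bit mutation: flip each bit independently with prob. c/n.
        child = [b ^ (rng.random() < c / n) for b in child]
        # Elitist replacement: child replaces a worst individual if not worse.
        worst = min(range(2), key=lambda i: onemax(pop[i]))
        if onemax(child) >= onemax(pop[worst]):
            pop[worst] = child
        if onemax(child) == n:
            return gen
    return max_gens
```

For small n, running this with c ≈ 1.2122 versus c = 1.0 and averaging over many seeds illustrates the kind of comparison the theorems make rigorous (the asymptotic bounds, of course, are about large n).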