Abstract

Genetic algorithms (GAs) started with generic mutation and crossover operators, but over the years, specialized representations and operators designed for a specific domain or problem, such as the TSP, have proved the most effective. In this paper, we define a class of new GA operators that automatically adjust to each problem. The adjustments, or instantiations, are based on a domain model presented to the operators in the form of a Bayesian network, as generated by the hierarchical Bayesian Optimization Algorithm (hBOA). We then show that these operators outperform standard random operators as long as the models are of sufficient quality.
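The sketch below is not from the paper; it is a minimal illustration of one way a crossover operator could be instantiated from a learned model's structure, assuming that linkage groups are taken to be connected components of the Bayesian network's dependency graph and that whole groups are exchanged between parents as units. All function and variable names are hypothetical.

    # Illustrative sketch only (not the paper's method): a crossover operator that
    # exchanges whole linkage groups between parents instead of independent bits.
    # Linkage groups are assumed to be connected components of the dependency
    # graph of the Bayesian network learned by hBOA.
    import random

    def linkage_groups(edges, n_vars):
        """Group variables into connected components of the model's dependency graph."""
        parent = list(range(n_vars))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        for a, b in edges:
            ra, rb = find(a), find(b)
            if ra != rb:
                parent[ra] = rb

        groups = {}
        for v in range(n_vars):
            groups.setdefault(find(v), []).append(v)
        return list(groups.values())

    def model_informed_crossover(p1, p2, groups):
        """Swap each linkage group as a unit with probability 0.5 (building-block exchange)."""
        c1, c2 = list(p1), list(p2)
        for group in groups:
            if random.random() < 0.5:
                for i in group:
                    c1[i], c2[i] = c2[i], c1[i]
        return c1, c2

    # Example: a 6-bit problem whose model links {0,1,2} and {3,4}; variable 5 is independent.
    edges = [(0, 1), (1, 2), (3, 4)]
    groups = linkage_groups(edges, 6)
    child1, child2 = model_informed_crossover([1, 1, 1, 0, 0, 1],
                                              [0, 0, 0, 1, 1, 0], groups)

Under these assumptions, the operator preserves the building blocks identified by the model, which is the kind of automatic, problem-specific adjustment the abstract describes; the random counterpart would instead recombine bits independently of the learned structure.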
