Abstract

Stochastic optimization by learning and using probabilistic models has received an increasing amount of attention over the last few years. Algorithms in this field estimate the probability distribution of a selection of the available solutions and subsequently draw new samples from the estimated distribution. The resulting algorithms have displayed good performance on a wide variety of single-objective optimization problems, for both binary and real-valued variables. Mixture distributions offer a powerful tool for modeling complicated dependencies between the problem variables. Moreover, they allow for an elegant and parallel exploration of the multi-objective front. This parallel exploration aids the preservation of diversity, which is important in multi-objective optimization. In this paper, we propose a new algorithm for evolutionary multi-objective optimization by learning and using probabilistic mixture distributions. We name this algorithm the Multi-objective Mixture-based Iterated Density Estimation Evolutionary Algorithm (MIDEA). To further improve and maintain the diversity obtained by the mixture distribution, we use a specialized diversity-preserving selection operator. We verify the effectiveness of our approach in two different problem domains and compare it with two other well-known, efficient multi-objective evolutionary algorithms.
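The abstract describes the general iterated density-estimation loop that algorithms of this kind follow: select a subset of the population, estimate a (mixture) distribution over it, and sample new solutions from that distribution. The sketch below is only an illustration of that generic loop on a single-objective problem, not the MIDEA algorithm or its diversity-preserving selection; the objective function, truncation fraction, population size, and number of mixture components are assumptions chosen for the example.

```python
# Minimal sketch of an iterated density-estimation loop with a mixture model.
# Illustrative only; not the MIDEA algorithm from the paper.
import numpy as np
from sklearn.mixture import GaussianMixture

def sphere(x):
    # Hypothetical objective: minimize the sum of squares of each solution.
    return np.sum(x ** 2, axis=1)

rng = np.random.default_rng(0)
pop_size, dim, n_components, tau = 200, 10, 3, 0.3

# Initial population drawn uniformly from [-5, 5]^dim.
population = rng.uniform(-5.0, 5.0, size=(pop_size, dim))

for generation in range(50):
    # Selection: keep the best tau fraction of the population.
    fitness = sphere(population)
    selected = population[np.argsort(fitness)[: int(tau * pop_size)]]

    # Estimate a mixture distribution over the selected solutions.
    model = GaussianMixture(n_components=n_components, random_state=0)
    model.fit(selected)

    # Sample new solutions from the estimated distribution to refill the population.
    offspring, _ = model.sample(pop_size - len(selected))
    population = np.vstack([selected, offspring])

print("best objective value:", sphere(population).min())
```

In a multi-objective setting such as the one treated in the paper, the scalar truncation selection above would be replaced by a diversity-preserving selection over the approximation of the Pareto front, with the mixture components exploring different parts of that front in parallel.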
