Abstract

Subset selection plays an important role in the field of evolutionary multiobjective optimization (EMO). In particular, in an EMO algorithm with an unbounded external archive (UEA), subset selection is an essential post-processing procedure for selecting a prespecified number of solutions as the final result. In this article, we discuss the efficiency of greedy subset selection for the hypervolume, inverted generational distance (IGD), and IGD plus (IGD+) indicators. Greedy algorithms usually handle subset selection efficiently. However, when a large number of solutions are given (e.g., subset selection from tens of thousands of solutions in a UEA), they often become time-consuming. Our idea is to use the submodular property, which is known for the hypervolume indicator, to improve their efficiency. First, we prove that the IGD and IGD+ indicators are also submodular. Next, based on the submodular property, we propose an efficient greedy inclusion algorithm for each indicator. We demonstrate through computational experiments that the proposed algorithms are much faster than the standard greedy subset selection algorithms. The proposed algorithms also facilitate research on performance indicators.
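The acceleration the abstract refers to rests on a standard consequence of submodularity: a candidate's marginal gain can only shrink as the selected subset grows, so stale gains are valid upper bounds and most candidates need not be re-evaluated in every iteration. Below is a minimal sketch of this lazy greedy idea for hypervolume-based subset selection on a 2-D minimization front. The functions `hypervolume_2d` and `lazy_greedy_subset` are illustrative assumptions for exposition, not the authors' algorithms or implementation.

```python
import heapq


def hypervolume_2d(points, ref):
    """Hypervolume of a set of 2-D minimization points w.r.t. reference point `ref`.

    Illustrative sweep-line computation; assumes every point dominates `ref`.
    """
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in sorted(set(points)):          # sweep in increasing f1
        if f2 < prev_f2:                        # dominated points contribute nothing
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv


def lazy_greedy_subset(points, k, ref):
    """Greedily select k points maximizing hypervolume, with lazy gain updates.

    Heap entries are (-stale_gain, last_eval_size, index), where last_eval_size
    is the size of the selected set when the gain was last computed. By
    submodularity, a stale gain upper-bounds the true current gain, so if the
    popped entry is already up to date it can be accepted immediately.
    """
    selected, current_hv = [], 0.0
    heap = [(-hypervolume_2d([p], ref), 0, i) for i, p in enumerate(points)]
    heapq.heapify(heap)
    while heap and len(selected) < k:
        neg_gain, last_eval, i = heapq.heappop(heap)
        if last_eval == len(selected):
            # Gain is fresh for the current selected set: select this point.
            selected.append(points[i])
            current_hv += -neg_gain
        else:
            # Recompute the gain against the current set and push it back.
            gain = hypervolume_2d(selected + [points[i]], ref) - current_hv
            heapq.heappush(heap, (-gain, len(selected), i))
    return selected, current_hv


if __name__ == "__main__":
    front = [(1.0, 3.0), (2.0, 1.0), (1.5, 2.0), (3.0, 0.5), (2.5, 2.5)]
    subset, hv = lazy_greedy_subset(front, k=3, ref=(4.0, 4.0))
    print(subset, hv)
```

The same lazy-evaluation structure carries over to IGD and IGD+ once their submodularity is established; only the gain computation changes, which is where the indicator-specific speedups discussed in the article come in.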
