Abstract

Large-scale sparse multi-objective optimization problems have recently attracted increasing attention from researchers. Unlike general large-scale multi-objective optimization problems, most of the decision variables (Decs) of large-scale sparse multi-objective optimization problems are equal to zero. Among existing sparse evolutionary algorithms, SparseEA and SparseEA2 accelerate convergence by maintaining a binary Mask that dynamically sets some of the real decision variables to zero. When determining Mask updates, both SparseEA and SparseEA2 use a static fitness. However, a static fitness is constrained by the number of iterations, making it difficult to capture global information. To address this issue, we design an adaptive fitness for Mask updates. Moreover, we scale the number of decision variables flipped per update with the total number of decision variables, and gradually decrease the flip probability as the number of iterations increases. When crossing the real Decs, we only cross those real Decs whose Mask values are the same in both parents. We conduct experiments on eight benchmark problems and three real-world application problems, and the simulation results show that our algorithm is significantly more efficient than four existing sparse large-scale multi-objective algorithms.
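The dimension-scaled, iteration-decaying flip described above can be sketched as follows. This is a minimal illustration under assumed formulas (the linear decay, the `n // 100` candidate count, and the function name `adaptive_flip` are hypothetical choices, not the paper's exact scheme):

```python
import random

def adaptive_flip(mask, iteration, max_iter, rng=random):
    """Flip bits of a binary Mask with an iteration-decaying probability.

    Sketch only: the number of candidate flips scales with the number of
    decision variables, and the per-flip probability decreases as the run
    progresses (assumed linear decay, not the paper's exact formula).
    """
    n = len(mask)
    # More decision variables -> more candidate positions per update.
    num_candidates = max(1, n // 100)
    # Flip probability shrinks linearly as iterations accumulate.
    flip_prob = 1.0 - iteration / max_iter
    new_mask = mask[:]
    for _ in range(num_candidates):
        if rng.random() < flip_prob:
            i = rng.randrange(n)
            new_mask[i] = 1 - new_mask[i]  # toggle the Mask bit
    return new_mask
```

Early in the run, the Mask changes aggressively to explore sparsity patterns; by the final iteration the flip probability reaches zero, so the discovered pattern is preserved.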
