Abstract

Many types of multi-objective optimization problems, e.g., multi-modal problems and large-scale problems, have been solved effectively by a number of tailored multi-objective evolutionary algorithms. However, little attention has been paid to sparse optimization problems, in which most decision variables are zero in the Pareto optimal solution set. Recently, algorithms for solving sparse problems have developed rapidly, and many sparse optimization problems in machine learning, such as the search for lightweight neural networks, can be solved with the help of multi-objective evolutionary algorithms. In this paper, we introduce a sparse truncation operator that uses the accumulated gradient value of each decision variable as the criterion for setting it to zero. In addition, to balance exploration and exploitation, a cluster-based competitive particle swarm optimizer is proposed, which combines the strengths of particle swarm optimization and the competitive swarm optimizer to search efficiently and escape from local optima. Building on these two components, a novel cluster-based competitive particle swarm optimizer with a sparse truncation operator is proposed for solving sparse multi-objective optimization problems, and experimental results show that it outperforms its peers on sparse test instances and neural network training tasks.
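The abstract only names the two building blocks; the sketch below illustrates (a) a truncation rule driven by accumulated gradient magnitudes and (b) the standard competitive swarm optimizer loser update that such a hybrid could build on. All function names, the quantile threshold, the clustering granularity, and the toy objective are assumptions for illustration, not the paper's actual operators.

```python
import numpy as np

# Illustrative sketch only; the exact operators are defined in the paper.
# Assumption: variables whose accumulated |gradient| stays small contribute
# little to the objectives and are candidates for truncation to zero.

def sparse_truncate(x, acc_grad, quantile=0.5):
    """Zero the decision variables with the smallest accumulated gradients."""
    threshold = np.quantile(acc_grad, quantile)  # hypothetical cut-off rule
    x = x.copy()
    x[acc_grad <= threshold] = 0.0
    return x

def cso_loser_update(x_w, x_l, v_l, x_mean, phi=0.1, rng=None):
    """Standard competitive swarm optimizer step: the loser of a pairwise
    competition learns from the winner and from the swarm (or cluster) mean."""
    rng = rng or np.random.default_rng()
    r1, r2, r3 = (rng.random(x_l.shape) for _ in range(3))
    v_l = r1 * v_l + r2 * (x_w - x_l) + phi * r3 * (x_mean - x_l)
    return x_l + v_l, v_l

# Toy usage on a 10-variable problem where only the first 3 variables matter.
rng = np.random.default_rng(0)
x = rng.normal(size=10)
acc = np.zeros(10)
for _ in range(20):
    grad = np.zeros(10)
    grad[:3] = 2.0 * x[:3]  # gradient of sum(x[:3]**2); flat elsewhere
    acc += np.abs(grad)     # accumulate gradient magnitudes per variable
print(sparse_truncate(x, acc, quantile=0.7))  # trailing variables become zero
```

In this reading, the accumulated gradient acts as a cheap importance score: variables whose objectives are insensitive to them are zeroed, which directly encourages the sparsity the Pareto optimal set is assumed to have.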
