Abstract
Many types of multi-objective optimization problems, such as multi-modal and large-scale problems, have been solved effectively by a number of tailored multi-objective evolutionary algorithms. By contrast, little attention has been paid to sparse optimization problems, in which most decision variables are zero in the Pareto optimal solution set. Recently, algorithms for solving sparse problems have developed rapidly, and many sparse optimization problems in machine learning, such as the search for lightweight neural networks, can be solved with the help of multi-objective evolutionary algorithms. In this paper, we introduce a sparse truncation operator that uses the accumulated gradient value as the criterion for setting a decision variable to zero. In addition, to balance exploration and exploitation, a cluster-based competitive particle swarm optimizer is proposed, which combines the strengths of particle swarm optimization and the competitive swarm optimizer to search efficiently and escape from local optima. Combining these two components, we propose a novel cluster-based competitive particle swarm optimizer with a sparse truncation operator for sparse multi-objective optimization problems. Experimental results show that the proposed algorithm outperforms its peers on sparse test instances and neural network training tasks.
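For intuition, the sketch below shows one plausible form of a gradient-based sparse truncation: it accumulates per-variable gradient magnitudes over several evaluations and zeroes out the variables whose accumulated gradient is smallest. The function name `sparse_truncation`, the ranking-based keep rule, and the `keep_ratio` parameter are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sparse_truncation(solution, accumulated_grad, keep_ratio=0.1):
    """Hypothetical sketch of a gradient-based truncation operator.

    Keeps only the decision variables whose accumulated gradient
    magnitude ranks in the top `keep_ratio` fraction; all other
    variables are set to zero, enforcing a sparse solution.
    """
    n_keep = max(1, int(keep_ratio * solution.size))
    # Indices of the n_keep variables with the largest accumulated gradient.
    keep_idx = np.argsort(np.abs(accumulated_grad))[-n_keep:]
    truncated = np.zeros_like(solution)
    truncated[keep_idx] = solution[keep_idx]
    return truncated

# Toy usage: accumulate gradients of f(x) = ||x||^2 over a few steps,
# then truncate a candidate solution to its most gradient-active variables.
rng = np.random.default_rng(0)
dim = 20
x = rng.normal(size=dim)
acc = np.zeros(dim)
for _ in range(5):
    grad = 2.0 * x           # gradient of the quadratic objective
    acc += np.abs(grad)      # accumulate per-variable gradient magnitude
sparse_x = sparse_truncation(x, acc, keep_ratio=0.2)
print(np.count_nonzero(sparse_x))  # -> 4 of 20 variables remain non-zero
```

In this toy setting the ranking rule and the quadratic objective simply make the mechanism concrete; the paper's operator would apply the same idea to gradients gathered during the evolutionary search.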