Abstract

Various evolutionary algorithms (EAs) have been proposed to address feature selection (FS) problems, in which a large number of fitness evaluations are needed. With the rapid growth of data scales, fitness evaluation becomes time-consuming, which turns FS problems into expensive optimization problems. Surrogate-assisted EAs (SAEAs) have been widely used to solve expensive optimization problems. However, SAEAs still face difficulties in solving expensive FS problems due to their high-dimensional discrete decision variables. To address this issue, we propose an SAEA with parallel random grouping for expensive FS problems, which consists of three main components. First, a constraint-based sampling strategy is proposed, which considers the influence of the constraint boundary and the number of selected features. Second, a high-dimensional FS problem is randomly divided into several low-dimensional subproblems, surrogate models are constructed in these low-dimensional decision spaces, and all the subproblems are then optimized in parallel; this process of random grouping and parallel optimization continues until the termination condition is met. Finally, the final solution is chosen from the best solution in the historical search and the best solution in the last population using a random-, distance-, or voting-based method. Experimental results show that the proposed algorithm generally outperforms traditional, ensemble, and evolutionary FS methods on 14 datasets with up to 10 000 features, especially when the required number of real fitness evaluations is limited.
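To make the overall structure concrete, the following is a minimal sketch of a random-grouping and parallel-optimization loop of the kind summarized above. It is not the authors' implementation: the toy fitness, the simple bit-flip search inside each subproblem, and the helper names (toy_fitness, random_grouping, optimize_subproblem) are illustrative assumptions standing in for the paper's constraint-based sampling, surrogate models, and EA operators.

```python
# Minimal sketch (assumed, not the paper's code) of random grouping plus
# parallel optimization of low-dimensional subproblems for feature selection.
import numpy as np
from concurrent.futures import ThreadPoolExecutor

rng = np.random.default_rng(0)

def toy_fitness(mask):
    # Placeholder for the expensive evaluation (e.g., training a classifier
    # on the selected subset); here: prefer few, "useful" features.
    weights = np.linspace(1.0, 0.0, mask.size)   # fake feature usefulness
    return -(mask * weights).sum() + 0.01 * mask.sum()

def random_grouping(n_features, n_groups):
    # Randomly split the feature indices into disjoint low-dimensional groups.
    return np.array_split(rng.permutation(n_features), n_groups)

def optimize_subproblem(base_mask, group, budget=20):
    # Optimize only the bits in `group`, keeping the rest fixed. A real SAEA
    # would search a surrogate built in this low-dimensional space; this
    # sketch just does random bit-flip search on the toy fitness.
    best, best_f = base_mask.copy(), toy_fitness(base_mask)
    for _ in range(budget):
        cand = best.copy()
        flip = rng.choice(group, size=max(1, len(group) // 10), replace=False)
        cand[flip] ^= 1
        f = toy_fitness(cand)
        if f < best_f:
            best, best_f = cand, f
    return best

n_features, n_groups, n_cycles = 1000, 10, 5
best_mask = rng.integers(0, 2, n_features).astype(np.int8)

for _ in range(n_cycles):                         # re-group every cycle
    groups = random_grouping(n_features, n_groups)
    with ThreadPoolExecutor() as pool:            # optimize groups in parallel
        sols = list(pool.map(lambda g: optimize_subproblem(best_mask, g), groups))
    # Merge: take each group's bits from that subproblem's best solution.
    merged = best_mask.copy()
    for sol, g in zip(sols, groups):
        merged[g] = sol[g]
    if toy_fitness(merged) < toy_fitness(best_mask):
        best_mask = merged

print("selected features:", int(best_mask.sum()), "of", n_features)
```

Re-grouping at every cycle lets different combinations of variables be optimized together over time, while each subproblem stays low-dimensional enough for a surrogate to be built cheaply.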

