Abstract

Feature selection (FS) is an essential technique widely applied in data mining. Recent studies have shown that evolutionary computation (EC) is very promising for FS due to its powerful search capability. However, most existing EC-based FS methods use a fixed-length encoding to represent feature subsets. This inflexible encoding becomes ineffective on high-dimensional data, because it results in a huge search space as well as a large amount of training time and memory overhead. In this article, we propose a length-adaptive genetic algorithm with Markov blanket (LAGAM), which adopts a length-variable individual encoding and enables individuals to evolve in their own search spaces. In LAGAM, features are ranked in descending order of relevance, and an adaptive length-changing operator is introduced that extends or shortens an individual to guide it toward a more promising search space. Local search based on the Markov blanket (MB) is embedded to further improve individuals. Experiments are conducted on 12 high-dimensional datasets, and the results show that LAGAM outperforms existing methods, achieving higher classification accuracy with fewer features.
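To make the length-variable encoding concrete, the following is a minimal, hypothetical Python sketch of the general idea: an individual is a bit mask over only the top-L features of a relevance-ranked list, and a simple length-changing operator grows or shrinks L. The function names, the improvement criterion, and all parameters are illustrative assumptions, not the paper's exact operators.

```python
import random

# Hypothetical illustration: a variable-length individual selects features only
# from the first `length` entries of a relevance-ranked feature list, so short
# individuals search a far smaller space than a fixed, full-length mask would.

def make_individual(length, p_select=0.5):
    """A bit mask over the first `length` relevance-ranked features."""
    return [1 if random.random() < p_select else 0 for _ in range(length)]

def change_length(individual, improved, step, max_length):
    """Toy length-changing operator: extend the mask when the individual keeps
    improving (admit more features), shorten it otherwise (focus the search)."""
    if improved and len(individual) + step <= max_length:
        # Extend: append randomly initialised bits for newly admitted features.
        return individual + [random.randint(0, 1) for _ in range(step)]
    if not improved and len(individual) > step:
        # Shorten: drop the tail, i.e. the least relevant features in scope.
        return individual[:-step]
    return individual

def selected_features(individual, ranked_features):
    """Map the mask back to original feature indices (most relevant first)."""
    return [ranked_features[i] for i, bit in enumerate(individual) if bit]

if __name__ == "__main__":
    # Feature indices sorted by an assumed relevance score, most relevant first.
    ranked = [7, 2, 9, 0, 4, 1, 8, 3, 6, 5]
    ind = make_individual(length=4)
    print("initial subset:", selected_features(ind, ranked))
    ind = change_length(ind, improved=True, step=2, max_length=len(ranked))
    print("after extension:", selected_features(ind, ranked))
```

Because the mask covers only the most relevant features, extending an individual admits candidates in relevance order rather than at random, which is the intuition behind guiding each individual to its own, smaller search space.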
