As datasets grow in dimensionality and sample size, feature selection becomes increasingly important in machine learning. Because features are often relevant to multiple tasks, casting feature selection in a multi-task optimization framework can improve classification performance. Multifactorial optimization provides a powerful evolutionary multitasking paradigm capable of handling several related optimization tasks simultaneously. Inspired by these ideas, this article proposes a parameter-adaptive multifactorial feature selection algorithm (AMFEA). To escape local optima, AMFEA employs a local search strategy that steers the search toward the global optimum. In addition, AMFEA introduces an adaptive knowledge-transfer parameter matrix whose entries are dynamically adjusted according to population fitness, controlling the frequency of knowledge transfer between tasks; this enables effective cross-task knowledge transfer and rapid convergence. Experimental results on 18 high-dimensional datasets show that AMFEA significantly improves classification accuracy compared with evolutionary algorithms and traditional feature selection methods.
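The adaptive transfer mechanism described above can be sketched as follows. This is a minimal illustration only: the update rule (reward a task pair when a transferred offspring improves fitness, decay it otherwise) and all names, constants, and bounds are assumptions for exposition, not the paper's exact formulation.

```python
# Hypothetical sketch of an adaptive knowledge-transfer matrix in an
# MFEA-style loop: each entry rmp[i][j] is the probability of crossover
# between individuals assigned to tasks i and j, nudged up after a
# successful transfer and down after a failed one.

def update_rmp(rmp_ij, success, lr=0.1, lo=0.1, hi=0.9):
    """Move the task-pair transfer probability toward 1 after a
    fitness-improving cross-task offspring, toward 0 otherwise,
    clipped to [lo, hi] so transfer is never fully disabled."""
    target = 1.0 if success else 0.0
    new = rmp_ij + lr * (target - rmp_ij)
    return min(hi, max(lo, new))

n_tasks = 2
# Symmetric matrix of pairwise transfer probabilities, initialized to 0.5.
rmp = [[0.5] * n_tasks for _ in range(n_tasks)]

# Simulated outcomes: did each transferred offspring beat its parent?
outcomes = [True, True, False, True, False]
for ok in outcomes:
    rmp[0][1] = rmp[1][0] = update_rmp(rmp[0][1], ok)

print(round(rmp[0][1], 4))  # prints 0.5238
```

In this sketch, successful transfers raise the pairwise probability and failures lower it, so knowledge flows more often between tasks whose exchanges have recently helped.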