Abstract

Feature selection is the task of finding the subset of features most relevant to a given problem in the context of machine learning. By selecting proper features, one can reduce the computational complexity of the learned model and possibly enhance its effectiveness by mitigating the well-known problem of overfitting. In recent years, feature selection has been modeled as an optimization task, where the idea is to find the subset of features that maximizes some fitness function, such as a given classifier's accuracy or a measure of the samples' separability in the feature space. In this paper, we introduce Geometric Semantic Genetic Programming (GSGP) in the context of feature selection, and we show experimentally that it can work properly with both conic and non-conic fitness landscapes. We observed that there is no need to restrict the feature selection modeling to GSGP constraints, which can be quite useful for adapting the semantic operators to a broader range of applications.
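To make the optimization view of feature selection concrete, the sketch below evaluates binary feature masks against a fitness function defined as leave-one-out 1-NN accuracy, and searches for the best mask. This is a minimal illustration only: the dataset is hypothetical, and plain random search stands in for the evolutionary loop that GSGP (or any metaheuristic) would provide.

```python
import random

# Hypothetical toy dataset: 6 samples, 4 features.
# Features 0 and 1 separate the classes; features 2 and 3 are noise.
X = [
    [1.0, 1.0, 0.3, 0.9],
    [1.1, 0.9, 0.8, 0.1],
    [0.9, 1.1, 0.2, 0.5],
    [5.0, 5.0, 0.7, 0.4],
    [5.1, 4.9, 0.1, 0.8],
    [4.9, 5.1, 0.9, 0.2],
]
y = [0, 0, 0, 1, 1, 1]

def fitness(mask):
    """Leave-one-out 1-NN accuracy using only features where mask[k] == 1."""
    if not any(mask):
        return 0.0
    correct = 0
    for i in range(len(X)):
        best_dist, pred = None, None
        for j in range(len(X)):
            if i == j:
                continue
            d = sum((X[i][k] - X[j][k]) ** 2
                    for k in range(len(mask)) if mask[k])
            if best_dist is None or d < best_dist:
                best_dist, pred = d, y[j]
        correct += (pred == y[i])
    return correct / len(X)

random.seed(0)
# Random search over binary masks stands in for the evolutionary search.
best_mask = max(
    ([random.randint(0, 1) for _ in range(4)] for _ in range(50)),
    key=fitness,
)
print(best_mask, fitness(best_mask))
```

Any wrapper-style fitness (a classifier's cross-validated accuracy) or filter-style fitness (a class-separability score) can be dropped in for `fitness` without changing the search loop.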
