Abstract

Feature selection aims to select relevant features from the original feature set, but the selected features do not all have the same degree of importance. This can be addressed by feature weighting, a method for quantifying the ability of features to discriminate between instances of different classes. Multiple feature selection methods have shown that a well-chosen feature subset can reduce the data dimensionality while maintaining or even improving the classification accuracy. However, the same feature can have different abilities to distinguish instances of one class from those of the other classes, which makes feature selection a difficult task: an optimal feature subset weighting vector must be found for each class. Motivated by this observation, feature selection and feature weighting can be cast as a Bi-Level Optimization Problem (BLOP), where feature selection is performed at the upper level and feature weighting at the lower level by multiple followers, each of which generates a set of weighting vectors for one class; only the optimal feature subset weighting vector is retained for each class. In this paper, we propose a bi-level evolutionary approach for class-dependent feature selection and weighting using a Genetic Algorithm (GA), called Bi-level Class-Dependent Weighted Feature Selection (BCDWFS). The basic idea of BCDWFS is to exploit the bi-level model to perform feature selection at the upper level and feature weighting at the lower level, with the aim of finding the optimal weighting vectors of a feature subset for each class. Our approach has been assessed on ten datasets and compared to three existing approaches, using three different classifiers for accuracy evaluation. Experimental results show that the proposed algorithm is competitive with, and in several cases better than, the state-of-the-art algorithms.
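The bi-level scheme described above can be sketched in miniature: an upper-level GA evolves binary feature masks, and for each mask a lower-level search (one "follower" per class) tunes a per-class weight vector over the selected features. This is an illustrative toy, not the authors' BCDWFS implementation: the synthetic dataset, the weighted nearest-centroid fitness, and all hyperparameters (population sizes, generation counts, mutation rate) are assumptions chosen to keep the sketch short and runnable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset (assumption): 2 informative features + 2 noise features, 2 classes.
X = rng.normal(size=(60, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

def accuracy(mask, weights_per_class):
    """Fitness: weighted nearest-centroid accuracy, with a separate
    weight vector per class over the selected features (class-dependent)."""
    sel = np.flatnonzero(mask)
    if sel.size == 0:
        return 0.0
    centroids = {c: X[y == c][:, sel].mean(axis=0) for c in np.unique(y)}
    preds = []
    for x in X[:, sel]:
        # Distance to each centroid, scaled by that class's own weights.
        d = {c: np.sum(weights_per_class[c] * (x - centroids[c]) ** 2)
             for c in centroids}
        preds.append(min(d, key=d.get))
    return float(np.mean(np.array(preds) == y))

def lower_level(mask, gens=15):
    """One follower per class: locally evolve that class's weight vector."""
    sel = np.flatnonzero(mask)
    best = {c: rng.random(sel.size) for c in np.unique(y)}
    best_fit = accuracy(mask, best)
    for _ in range(gens):
        for c in best:  # each follower perturbs only its own class weights
            cand = dict(best)
            cand[c] = np.clip(best[c] + rng.normal(scale=0.3, size=sel.size), 0, 1)
            fit = accuracy(mask, cand)
            if fit >= best_fit:
                best, best_fit = cand, fit
    return best, best_fit

def upper_level(gens=10, pop=8):
    """GA over binary feature masks; each mask is evaluated by the
    lower-level weighting search (the bi-level coupling)."""
    population = [rng.integers(0, 2, X.shape[1]) for _ in range(pop)]
    best_mask, best_w, best_fit = None, None, -1.0
    for _ in range(gens):
        scored = []
        for m in population:
            w, fit = lower_level(m)
            scored.append((fit, m, w))
            if fit > best_fit:
                best_fit, best_mask, best_w = fit, m.copy(), w
        scored.sort(key=lambda t: t[0], reverse=True)
        parents = [m for _, m, _ in scored[: pop // 2]]   # truncation selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.choice(len(parents), 2, replace=False)
            cut = int(rng.integers(1, X.shape[1]))        # one-point crossover
            child = np.concatenate([parents[a][:cut], parents[b][cut:]])
            if rng.random() < 0.2:                        # bit-flip mutation
                i = int(rng.integers(X.shape[1]))
                child[i] ^= 1
            children.append(child)
        population = parents + children
    return best_mask, best_w, best_fit

mask, weights, fit = upper_level()
print("selected mask:", mask, "fitness:", round(fit, 3))
```

On this separable toy problem the search typically recovers a mask containing the two informative features and a fitness well above chance; in the paper's setting the lower level would instead return one optimal weighting vector per class for the real dataset at hand.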
