Abstract

Feature selection has attracted much attention as a means of obtaining discriminative and non-redundant features from high-dimensional data. Compared with traditional filter and wrapper methods, embedded methods can obtain a more informative feature subset by fully accounting for the importance of features in classification tasks. However, existing embedded methods emphasize this importance while largely ignoring the correlation between features, so correlated and redundant features with similar scores are retained in the feature subset. To address this problem, we propose a novel supervised embedded feature selection framework, called feature selection under global redundancy minimization in orthogonal regression (GRMOR). The proposed framework effectively recognizes redundant features from a global view of the redundancy among features. We also incorporate a large margin constraint into GRMOR for robust multi-class classification. Compared with traditional embedded methods based on least squares regression, the proposed framework utilizes orthogonal regression to preserve more discriminative information in the subspace, which helps accurately rank the importance of features in classification tasks. Experimental results on twelve public datasets demonstrate that the proposed framework achieves superior classification and redundancy-removal performance compared with twelve other feature selection methods.
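The core idea of combining feature relevance with redundancy minimization can be illustrated with a minimal sketch. The snippet below is not the GRMOR algorithm itself (which solves an orthogonal-regression objective with a global redundancy term); it is a simplified greedy analogue, assuming correlation-based relevance scores in place of the regression-derived importance and pairwise feature correlation as the redundancy measure. The function name `select_features` and the trade-off parameter `alpha` are illustrative choices, not from the paper.

```python
import numpy as np


def select_features(X, y, k, alpha=0.5):
    """Toy redundancy-aware selection: greedily pick k features,
    scoring each candidate by (relevance - alpha * redundancy).

    relevance  : |correlation-like score| of the feature with the target
                 (a stand-in for GRMOR's regression-based importance)
    redundancy : mean |correlation| with the already-selected features
    """
    n, d = X.shape
    # standardize features so correlations are comparable
    Xc = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    # relevance of each feature to the (centered) target
    relevance = np.abs(Xc.T @ (y - y.mean())) / n
    # pairwise absolute feature correlations (the "redundancy" view)
    corr = np.abs(np.corrcoef(Xc, rowvar=False))

    chosen = []
    for _ in range(k):
        best, best_score = None, -np.inf
        for j in range(d):
            if j in chosen:
                continue
            red = corr[j, chosen].mean() if chosen else 0.0
            score = relevance[j] - alpha * red
            if score > best_score:
                best, best_score = j, score
        chosen.append(best)
    return chosen
```

For example, if one feature is a near-duplicate of an already-selected feature, its redundancy term is close to 1, so a less correlated but still informative feature is preferred instead. GRMOR differs in that it scores redundancy globally over the whole subset rather than greedily, but the relevance-versus-redundancy trade-off is the same.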
