Abstract

Various supervised embedded methods have been proposed to select discriminative features from the original feature set, such as Feature Selection with Orthogonal Regression (FSOR) and Robust Feature Selection. Compared with embedded methods based on least-squares regression, FSOR uses orthogonal regression, which preserves more discriminative information in the subspace and yields better feature selection performance. However, embedded approaches have scarcely considered the dependency among the selected features. To address this shortcoming, we propose a two-stage (filter-embedded) feature selection technique based on Maximum Relevance Minimum Redundancy and FSOR, termed Orthogonal Regression with Minimum Redundancy (ORMR). We compare the feature selection performance of ORMR against nine other state-of-the-art supervised feature selection methods on six benchmark datasets. The results demonstrate the advantage of ORMR over the others in choosing discriminative features while accounting for redundancy within the selected feature subset.
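To make the filter stage concrete, below is a minimal sketch of greedy Maximum Relevance Minimum Redundancy selection. It is illustrative only: absolute Pearson correlation stands in for the relevance/redundancy measure (an assumption; mRMR is often defined with mutual information, and the abstract does not specify the measure ORMR uses), and the function names `pearson` and `mrmr_select` are hypothetical.

```python
import math

def pearson(a, b):
    """Absolute Pearson correlation between two equal-length sequences.

    Used here as a stand-in dependency measure; assumes neither
    sequence is constant (nonzero variance).
    """
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return abs(cov / (sa * sb))

def mrmr_select(columns, target, k):
    """Greedy mRMR filter stage.

    Picks k feature indices, at each step maximizing
    relevance(feature, target) minus the mean redundancy
    (dependency with the already-selected features).
    """
    relevance = [pearson(col, target) for col in columns]
    # Seed with the single most relevant feature.
    selected = [max(range(len(columns)), key=lambda j: relevance[j])]
    while len(selected) < k:
        def score(j):
            redundancy = sum(pearson(columns[j], columns[s])
                             for s in selected) / len(selected)
            return relevance[j] - redundancy
        remaining = [j for j in range(len(columns)) if j not in selected]
        selected.append(max(remaining, key=score))
    return selected
```

In a two-stage pipeline of this kind, the surviving feature indices would then be handed to the embedded (orthogonal-regression) stage, which performs the final discriminative selection.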
