Abstract

Feature selection is the process of selecting a subset of important features from the original feature set. Many existing information-theoretic feature selection algorithms concentrate on maximizing relevance and minimizing redundancy. In this paper, relevance and redundancy are extended to conditional relevance and conditional redundancy. Owing to the nature of these two conditional relations, they tend to capture feature relationships more accurately. A new framework integrating the two conditional relations is built in this paper, and two new feature selection methods are proposed: Minimum Conditional Relevance-Minimum Conditional Redundancy (MCRMCR) and Minimum Conditional Relevance-Minimum Intra-Class Redundancy (MCRMICR). The proposed methods can select features with high class relevance and low redundancy. Experimental results on twelve datasets verify that the proposed methods perform better at feature selection and achieve high classification accuracy.
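The abstract does not give the exact MCRMCR/MCRMICR criteria, but the general family it belongs to, greedy feature selection driven by (conditional) mutual information, can be sketched as follows. The scoring rule below, which combines a minimum conditional-relevance term with a mean pairwise-redundancy penalty, is an illustrative assumption for discrete features, not the authors' method; all function names are hypothetical.

```python
from collections import Counter
from math import log2

def entropy(xs):
    # Shannon entropy (bits) of a discrete sequence.
    n = len(xs)
    return -sum(c / n * log2(c / n) for c in Counter(xs).values())

def mutual_info(xs, ys):
    # I(X;Y) = H(X) + H(Y) - H(X,Y)
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def cond_mutual_info(xs, ys, zs):
    # I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
    return (entropy(list(zip(xs, zs))) + entropy(list(zip(ys, zs)))
            - entropy(list(zip(xs, ys, zs))) - entropy(zs))

def greedy_select(features, labels, k):
    """Forward selection: repeatedly add the feature whose class relevance,
    conditioned on already-selected features, minus its average redundancy
    with them, is largest. `features` maps names to discrete value lists."""
    selected, remaining = [], list(features)
    while remaining and len(selected) < k:
        def score(name):
            col = features[name]
            if not selected:
                return mutual_info(col, labels)
            # Conservative relevance: worst-case conditional MI with the class.
            relevance = min(cond_mutual_info(col, labels, features[s])
                            for s in selected)
            # Redundancy: mean MI with the features chosen so far.
            redundancy = (sum(mutual_info(col, features[s]) for s in selected)
                          / len(selected))
            return relevance - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

feats = {"f1": [0, 0, 1, 1],   # perfectly predictive
         "f2": [0, 1, 0, 1],   # independent of the class
         "f3": [0, 0, 1, 1]}   # exact copy of f1 (fully redundant)
labels = [0, 0, 1, 1]
print(greedy_select(feats, labels, 2))  # → ['f1', 'f2']
```

On this toy data the redundancy penalty makes the selector prefer the uninformative but non-redundant `f2` over the duplicate `f3`, which is the behavior a low-redundancy criterion is designed to produce.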
