Abstract

Feature selection aims to reduce the dimensionality of patterns for classification by selecting the most informative features rather than irrelevant and/or redundant ones. In this study, two novel information-theoretic measures for feature ranking are presented. The first is an improved formula for estimating the conditional mutual information between a candidate feature f_i and the target class C given the subset of already-selected features S, i.e., I(C; f_i | S), under the assumption that the information carried by the features is uniformly distributed. The second is a mutual information (MI) based constructive criterion that can capture both irrelevant and redundant input features under arbitrary distributions of feature information. Building on these two measures, two new feature selection algorithms are proposed: the quadratic MI-based feature selection (QMIFS) approach and the MI-based constructive criterion (MICC) approach. Neither algorithm requires presetting a parameter such as the β in Battiti's MIFS and Kwak and Choi's MIFS-U methods; the intractable problem of choosing an appropriate value for β to trade off relevance to the target class against redundancy with the already-selected features is therefore avoided completely. Experimental results demonstrate the good performance of QMIFS and MICC on both synthetic and benchmark data sets.
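For context, the baseline the abstract contrasts against is Battiti's MIFS, whose greedy criterion scores each candidate feature as J(f_i) = I(C; f_i) − β · Σ_{s∈S} I(f_i; s), so a β must be preset to weight redundancy against relevance. The sketch below illustrates that baseline criterion only, not the paper's QMIFS or MICC methods (whose exact formulas are not given in this abstract); the function name mifs_rank, the default β, and the use of scikit-learn's mutual_info_score on discretized features are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def mifs_rank(X, y, k, beta=0.5):
    """Greedy forward selection with Battiti's MIFS criterion:
    J(f) = I(C; f) - beta * sum_{s in S} I(f; s).

    X: (n_samples, n_features) array of discrete-valued features.
    y: class labels. Returns the indices of the k selected features.
    """
    n_features = X.shape[1]
    # Relevance term I(C; f_i), computed once per feature.
    relevance = np.array([mutual_info_score(y, X[:, i]) for i in range(n_features)])
    selected, remaining = [], list(range(n_features))
    while len(selected) < k and remaining:
        scores = []
        for f in remaining:
            # Redundancy term: MI between the candidate and each selected feature.
            redundancy = sum(mutual_info_score(X[:, f], X[:, s]) for s in selected)
            scores.append(relevance[f] - beta * redundancy)
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy usage with synthetic discrete features; continuous features would
# need discretization (e.g., np.digitize) before MI estimation.
rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(200, 10))
y = (X[:, 0] + X[:, 1] > 3).astype(int)
print(mifs_rank(X, y, k=3))
```

The paper's contribution is precisely to remove the β-weighted redundancy sum above, replacing it with parameter-free estimates of I(C; f_i | S), so that no relevance/redundancy tradeoff parameter has to be tuned.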
