Abstract

Mutual Information (MI), a measure from information theory, is widely used in feature selection. Despite its success, a promising feature property, namely the unique relevance (UR) of a feature, remains unexplored. In this paper, we improve the performance of mutual information-based feature selection (MIBFS) by exploiting UR. We provide a theoretical justification for the value of UR and prove that the optimal feature subset must contain all features with UR. Since existing MIBFS follows the criterion of Maximize Relevance with Minimum Redundancy (MRwMR), which ignores the UR of features, we augment it with the objective of boosting unique relevance (BUR). This leads to a new criterion for MIBFS, called MRwMR-BUR. We conduct experiments on six public datasets, and the results indicate that MRwMR-BUR consistently outperforms MRwMR when tested with three popular classifiers. We believe this new insight can lead to new optimality bounds and algorithms.
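
To make the idea concrete, the following is a minimal sketch of greedy MI-based feature selection with a unique-relevance bonus. It assumes discrete features, uses a simple plug-in MI estimator, and the function names (`unique_relevance`, `select_features`) and the exact scoring rule are illustrative assumptions, not the paper's formulation.

```python
# Minimal sketch: greedy MRwMR-style selection with a UR bonus.
# Assumes discrete-valued features and a plug-in MI estimator (illustrative only).
import numpy as np
from sklearn.metrics import mutual_info_score  # plug-in I(X; Y) for discrete labels

def unique_relevance(X, y, j):
    """Assumed proxy for UR: the information feature j adds about y beyond all
    other features, I(y; X_j | X_rest), computed via the chain rule as
    I(y; X_all) - I(y; X_rest) with joint feature states hashed to single ids."""
    def joint_states(cols):
        # Encode each row of the chosen columns as a single discrete state id.
        return np.unique(X[:, cols], axis=0, return_inverse=True)[1].ravel()
    all_cols = list(range(X.shape[1]))
    rest = [c for c in all_cols if c != j]
    return mutual_info_score(y, joint_states(all_cols)) - mutual_info_score(y, joint_states(rest))

def select_features(X, y, k):
    """Greedy selection: relevance minus average redundancy, plus a UR bonus
    (the equal weighting of the BUR term here is an illustrative choice)."""
    selected, remaining = [], list(range(X.shape[1]))
    ur = {j: unique_relevance(X, y, j) for j in remaining}
    while len(selected) < k and remaining:
        def score(j):
            relevance = mutual_info_score(X[:, j], y)
            redundancy = (np.mean([mutual_info_score(X[:, j], X[:, s]) for s in selected])
                          if selected else 0.0)
            return relevance - redundancy + ur[j]  # boost features with unique relevance
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

With `X` a small array of discrete-valued features and `y` the class labels, `select_features(X, y, k=5)` returns the indices of the chosen features. The plug-in estimator and the UR proxy are simplifications; continuous data would require a proper (e.g. k-NN based) MI estimator.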
