Abstract

In this work we propose a novel method for subset feature selection based on mutual information. We introduce structural parameters that define the probability of selecting each feature. These parameters are adjusted by maximizing the information that sampled groups of features carry about the class while simultaneously minimizing the size of those groups. After training, the parameters are used to select a subset of features. We report results on four synthetic datasets, each posing a different challenge, from detecting synergy among features to avoiding redundancy. On all four datasets, our method outperforms eight other mutual-information-based feature selection methods.
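
The abstract does not specify how the selection probabilities are parameterized, how subsets are sampled, or how the mutual information of a group is estimated. The following is a minimal sketch of one plausible reading, assuming per-feature logits passed through a sigmoid, Bernoulli sampling of subsets, a plug-in estimate of the joint mutual information for discrete features, a size penalty weight `lam`, and score-function (REINFORCE) updates; all of these choices, the toy dataset, and the hyperparameters are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: y is the XOR of features 0 and 1 (synergy),
# feature 2 is a redundant copy of feature 0, the remaining features are noise.
n, d = 2000, 8
X = rng.integers(0, 2, size=(n, d))
X[:, 2] = X[:, 0]          # redundant feature
y = X[:, 0] ^ X[:, 1]      # neither parent feature is informative on its own

def entropy(labels):
    """Plug-in entropy (in nats) of a 1-D array of discrete labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def joint_mi(X_sub, y):
    """Plug-in estimate of I(X_sub ; y) for a subset of discrete features."""
    if X_sub.shape[1] == 0:
        return 0.0
    # Encode each row of the subset as one discrete symbol.
    keys = np.array([hash(row.tobytes()) for row in np.ascontiguousarray(X_sub)])
    h_y_given_x = 0.0
    for k in np.unique(keys):
        idx = keys == k
        h_y_given_x += idx.mean() * entropy(y[idx])
    return entropy(y) - h_y_given_x

# Structural parameters: one logit per feature; sigmoid gives selection probability.
theta = np.zeros(d)
lam, lr, baseline = 0.05, 0.5, 0.0   # size penalty, step size, running reward baseline

for step in range(2000):
    p = 1.0 / (1.0 + np.exp(-theta))
    mask = rng.random(d) < p                       # sample a group of features
    reward = joint_mi(X[:, mask], y) - lam * mask.sum()
    baseline = 0.9 * baseline + 0.1 * reward
    # Score-function (REINFORCE) gradient for Bernoulli(sigmoid(theta)).
    theta += lr * (reward - baseline) * (mask.astype(float) - p)

selected = np.where(1.0 / (1.0 + np.exp(-theta)) > 0.5)[0]
print("selected features:", selected)   # ideally {0, 1}: synergy kept, redundancy dropped
```

Under these assumptions, the reward trades off the estimated joint information of the sampled group against its size, so redundant and irrelevant features lower the reward and their selection probabilities are driven down, while synergistic pairs are only rewarded when sampled together.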
