Abstract

Feature selection is a critical step in pattern recognition and many other applications. Feature selection strategies are typically categorized into wrapper and filter approaches; the filter approach has attracted much attention because of its flexibility and computational efficiency. Previously, we developed an ICA-MI framework for feature selection, in which the Mutual Information (MI) between features and class labels was used as the selection criterion. However, because that method relies on a linearity assumption, it is not applicable to arbitrary distributions. In this paper, exploiting the fact that the Gaussian Mixture Model (GMM) is generally a suitable tool for estimating probability densities, we propose the GMM-MI method for feature ranking and selection. We discuss the details of the GMM-MI algorithm, demonstrate experimental results, and compare the GMM-MI method with the ICA-MI method in terms of performance and computational efficiency.
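The abstract does not give the algorithm itself, but the general idea it names — fit GMM densities and use the MI between each feature and the class label as a ranking score — can be sketched as follows. This is an illustrative sketch, not the authors' implementation: it assumes scalar features ranked one at a time, per-class 1-D GMMs fit by a simple EM loop, and MI estimated as the sample average of log p(x|c) − log p(x), where p(x) is the prior-weighted mixture of the class-conditional densities. All function names are our own.

```python
import numpy as np

def gmm_pdf(x, w, mu, sigma):
    """Evaluate a 1-D Gaussian mixture density at the points x."""
    x = np.asarray(x)[:, None]
    comps = w * np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return comps.sum(axis=1)

def fit_gmm_1d(x, K=2, iters=100, seed=0):
    """Fit a K-component 1-D GMM to samples x with a basic EM loop."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, K, replace=False)           # initialize means at random samples
    sigma = np.full(K, x.std() + 1e-6)
    w = np.full(K, 1.0 / K)
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each sample
        dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        r = dens / (dens.sum(axis=1, keepdims=True) + 1e-300)
        # M-step: re-estimate weights, means, and standard deviations
        nk = r.sum(axis=0) + 1e-12
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
    return w, mu, sigma

def gmm_mi(x, y, K=2):
    """Estimate I(X; C) for a scalar feature x and class labels y
    as the sample mean of log p(x|c) - log p(x), with p(x|c) from
    per-class GMMs and p(x) their prior-weighted mixture."""
    classes, counts = np.unique(y, return_counts=True)
    priors = counts / len(y)
    models = {c: fit_gmm_1d(x[y == c], K) for c in classes}
    # Marginal density p(x) = sum_c P(c) p(x|c)
    px = np.zeros(len(x))
    for c, p in zip(classes, priors):
        px += p * gmm_pdf(x, *models[c])
    # Class-conditional density p(x_i | c_i) for each sample's own class
    pxc = np.empty(len(x))
    for c in classes:
        idx = (y == c)
        pxc[idx] = gmm_pdf(x[idx], *models[c])
    return float(np.mean(np.log(pxc + 1e-300) - np.log(px + 1e-300)))
```

To rank features, one would apply `gmm_mi` to each feature column and sort in decreasing order of the MI score; an informative feature (class-dependent distribution) receives a higher score than a feature independent of the label.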
