Abstract

In recent years, feature selection based on relevance-redundancy trade-off criteria has become a promising and widely used approach in machine learning. However, existing mutual information feature selection frameworks have limitations when applied to common feature selection problems in practice. To overcome these limitations, a new framework is developed by introducing a maximum relevance and minimum common redundancy criterion together with a minimax nonlinear optimization approach. In particular, a novel mutual information feature selection method based on the normalization of the maximum relevance and minimum common redundancy (N-MRMCR-MI) is presented, which produces a normalized criterion value in the range [0, 1] and is applied to regression problems. We perform extensive experimental comparisons against several state-of-the-art algorithms using different forecasting models (Bayesian additive regression trees, treed Gaussian processes, k-NN, and SVM) and different data sets (two simulated and five real data sets). The results show that the proposed algorithm outperforms the others in terms of feature selection and forecasting accuracy.
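To make the relevance-redundancy trade-off concrete, the sketch below shows a greedy mutual-information feature selector with a score normalized to [0, 1]. It is only an illustration under assumptions: the specific score used here (relevance divided by relevance plus mean redundancy) and the use of scikit-learn's `mutual_info_regression` estimator are not the paper's exact N-MRMCR-MI criterion or minimax optimization.

```python
# Illustrative greedy max-relevance / min-redundancy selection via mutual information.
# The normalized score below is an assumed form, not the paper's N-MRMCR-MI criterion.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def greedy_mi_selection(X, y, n_select, random_state=0):
    """Return indices of n_select features chosen by a normalized relevance-redundancy score."""
    n_features = X.shape[1]
    # Relevance: estimated MI between each feature and the regression target.
    relevance = mutual_info_regression(X, y, random_state=random_state)
    selected, remaining = [], list(range(n_features))
    for _ in range(n_select):
        best_j, best_score = None, -np.inf
        for j in remaining:
            if selected:
                # Redundancy: mean MI between candidate feature j and already selected features.
                redundancy = mutual_info_regression(
                    X[:, selected], X[:, j], random_state=random_state
                ).mean()
            else:
                redundancy = 0.0
            # Normalized trade-off score in [0, 1]: high relevance, low redundancy scores near 1.
            score = relevance[j] / (relevance[j] + redundancy + 1e-12)
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```

Because both the relevance and redundancy estimates are non-negative, the ratio stays in [0, 1], mirroring the normalization property claimed for N-MRMCR-MI; the selected subset can then be passed to any of the forecasting models mentioned above (e.g., k-NN or SVM regression).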
