Abstract

Feature selection, or attribute reduction, is an important data preprocessing technique for dimensionality reduction in machine learning and data mining. In this paper, a novel feature selection ensemble learning algorithm based on Tsallis entropy and Dempster–Shafer evidence theory (TSDS) is proposed. First, an improved correlation criterion based on Tsallis entropy is used to identify relevant features. A forward sequential approximate Markov blanket is then defined to eliminate redundant features. Ensemble learning is employed to approximate a globally optimal selection by acquiring feature subsets from different perspectives. Finally, an improved evidence theory approach is used to fuse all of the feature subsets into the final feature subset. To verify the effectiveness of TSDS, nine datasets from two different domains are used in the experimental analysis. The experimental results demonstrate that the proposed algorithm selects feature subsets more effectively and significantly improves classification performance.
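As a rough illustration of the Tsallis-entropy-based relevance step summarized above, the sketch below computes the Tsallis entropy of a discrete variable and a conditional-entropy-style relevance score between a feature and the class labels. The function names, the choice q = 2, and the toy data are illustrative assumptions; this is a minimal sketch of the general idea, not the paper's exact TSDS criterion, approximate Markov blanket, or evidence-fusion procedure.

```python
import numpy as np
from collections import Counter

def tsallis_entropy(labels, q=2.0):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of a discrete variable."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    if np.isclose(q, 1.0):
        # The limit q -> 1 recovers Shannon entropy
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def tsallis_relevance(feature, target, q=2.0):
    """Relevance score: S_q(target) - S_q(target | feature), higher = more relevant."""
    h_target = tsallis_entropy(target, q)
    values, counts = np.unique(feature, return_counts=True)
    weights = counts / counts.sum()
    h_conditional = sum(
        w * tsallis_entropy(target[feature == v], q)
        for v, w in zip(values, weights)
    )
    return h_target - h_conditional

# Toy usage: score each (discretised) feature column against the class labels
X = np.array([[0, 1, 1], [0, 0, 1], [1, 1, 0], [1, 0, 0]])
y = np.array([0, 0, 1, 1])
scores = [tsallis_relevance(X[:, j], y, q=2.0) for j in range(X.shape[1])]
```

Features with higher scores would be retained as candidates for the relevant set; a redundancy-elimination step (such as the approximate Markov blanket described in the abstract) would then prune them further.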
