Feature selection is commonly used to remove irrelevant features and improve classification performance. Some features are irrelevant to the learning process; removing them not only decreases training and testing times, but can also improve learning accuracy. This study proposes a novel supervised feature selection method based on the bounded sum of weighted fuzzy membership functions (BSWFM) and the Euclidean distances between their centers of gravity, which reduces the computational load and improves accuracy by removing irrelevant features. The study compares the performance of a neural network with weighted fuzzy membership functions (NEWFM) with and without the proposed feature selection method. The superiority of the NEWFM with feature selection over the NEWFM without feature selection was demonstrated using three experimental datasets from the UCI Machine Learning Repository: Statlog Heart, Parkinsons, and Ionosphere. Without feature selection, the NEWFM used all 13, 22, and 34 features of the Statlog Heart, Parkinsons, and Ionosphere datasets as inputs, yielding accuracies of 85.6%, 86.2%, and 91.2%, respectively. With feature selection, the NEWFM used minimum subsets of 10, 4, and 25 features as inputs, yielding accuracies of 87.4%, 88.2%, and 92.6%, respectively. The results show that the NEWFM with feature selection outperformed the NEWFM without feature selection.
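To make the selection criterion concrete, the sketch below illustrates one way a distance-based filter of this kind could be implemented. It is not the authors' implementation: the input format (each feature's class-wise BSWFM sampled on a common grid), the function names, and the threshold are all illustrative assumptions. The idea follows the abstract: compute the center of gravity of each feature's BSWFM per class and keep only features whose class centers are sufficiently far apart.

import numpy as np

def bswfm_center_of_gravity(memberships, x_grid):
    # Center of gravity of a BSWFM sampled on x_grid;
    # `memberships` holds the fuzzy membership value at each grid point.
    memberships = np.asarray(memberships, dtype=float)
    return float(np.sum(x_grid * memberships) / np.sum(memberships))

def select_features(bswfm_class0, bswfm_class1, x_grid, distance_threshold):
    # Hypothetical filter: keep a feature only if the Euclidean distance
    # between the centers of gravity of its two class-wise BSWFMs
    # exceeds distance_threshold (a tuning parameter, not from the paper).
    # bswfm_class0, bswfm_class1: arrays of shape (n_features, len(x_grid)).
    selected = []
    for i, (m0, m1) in enumerate(zip(bswfm_class0, bswfm_class1)):
        c0 = bswfm_center_of_gravity(m0, x_grid)
        c1 = bswfm_center_of_gravity(m1, x_grid)
        # For a single scalar feature the Euclidean distance reduces to |c0 - c1|.
        if abs(c0 - c1) > distance_threshold:
            selected.append(i)  # large class separation -> feature kept as relevant
    return selected

In this reading, features whose class-wise membership functions have nearly coincident centers of gravity contribute little class-discriminating information and are removed before training the NEWFM classifier.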