Abstract
Feature selection (FS) is a common preprocessing step in machine learning that selects an informative subset of features, enabling a model to perform better in prediction or classification. It aids the design of intelligent and expert systems used in computer vision, image processing, gene expression data analysis, intrusion detection, and natural language processing. In this paper, we introduce an effective filter method called Joint Mutual Information with Class relevance (JoMIC), built on multivariate Joint Mutual Information (JMI) and Mutual Information (MI). Our method considers both the JMI and the MI of a non-selected feature with the already selected features, with respect to a given class, so as to select features that are highly relevant to the class but non-redundant with those already chosen. We compare our method with seven other filter-based methods using the machine learning classifiers Logistic Regression, Support Vector Machine, K-Nearest Neighbor (KNN), Decision Tree, Random Forest, Naïve Bayes, and Stochastic Gradient Descent on various datasets. Experimental results over 16 benchmark datasets reveal that our method yields better accuracy, Matthews Correlation Coefficient (MCC), and F1-score than the competing methods. The strength of the proposed method lies in its objective function, which combines JMI and MI to choose relevant and non-redundant features.
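To illustrate the kind of selection procedure the abstract describes, the sketch below implements a greedy forward filter that combines a feature's MI with the class and its pairwise joint MI (together with already selected features) with the class. The exact JoMIC objective is not given in the abstract, so the scoring rule, the helper `greedy_jmi_mi_selection`, and the uniform discretization step are illustrative assumptions, not the authors' formulation.

```python
# Minimal sketch of a greedy JMI + MI filter, assuming a simple additive score.
# This is NOT the published JoMIC objective; it only mirrors the idea of
# balancing class relevance (MI) against joint information with selected
# features (JMI).
import numpy as np
from sklearn.metrics import mutual_info_score
from sklearn.preprocessing import KBinsDiscretizer


def greedy_jmi_mi_selection(X, y, k, n_bins=5):
    """Select k features by greedily maximizing an assumed MI + JMI score."""
    # Discretize continuous features so plug-in MI estimates are well defined.
    Xd = KBinsDiscretizer(n_bins=n_bins, encode="ordinal",
                          strategy="uniform").fit_transform(X).astype(int)
    n_features = Xd.shape[1]
    selected, remaining = [], list(range(n_features))

    # Relevance term: I(f; y) for every candidate feature.
    relevance = np.array(
        [mutual_info_score(Xd[:, j], y) for j in range(n_features)]
    )

    for _ in range(k):
        best_j, best_score = None, -np.inf
        for j in remaining:
            # Joint term: sum over selected s of I((f, s); y), where the pair
            # (f, s) is encoded as a single discrete variable.
            joint = sum(
                mutual_info_score(Xd[:, j] * n_bins + Xd[:, s], y)
                for s in selected
            )
            score = relevance[j] + joint  # assumed combination of MI and JMI
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        remaining.remove(best_j)
    return selected


if __name__ == "__main__":
    # Small usage example on a public benchmark dataset.
    from sklearn.datasets import load_breast_cancer
    data = load_breast_cancer()
    print(greedy_jmi_mi_selection(data.data, data.target, k=5))
```

The selected feature indices can then be fed to any of the classifiers listed above (e.g., KNN or Random Forest) for evaluation with accuracy, MCC, and F1-score.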