Abstract
This paper addresses the classification problem in machine learning, focusing on predicting class labels for datasets with continuous features. Recognizing the critical role of discretization in enhancing classification performance, the study integrates equal width binning (EWB) with two optimization algorithms: the bat algorithm (BA), yielding the combination referred to as EB, and the whale optimization algorithm (WOA), yielding the combination denoted EW. The primary objective is to determine the optimal technique for predicting relevant class labels. The paper emphasizes the significance of discretization in data preprocessing, offering a comprehensive approach that combines discretization techniques with optimization algorithms. An investigative study assessed the efficiency of EB and EW by evaluating their classification performance with the Naive Bayes and K-nearest neighbor algorithms on four continuous datasets from the UCI repository. The experimental findings show that the proposed EB substantially improves the accuracy, recall, and F-measure of data classification, and its classification performance outperforms the other existing approaches on all datasets.
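The EB and EW pipelines described above build on equal width binning, in which a continuous feature's range is split into intervals of identical width. As a point of reference only, a minimal sketch of plain equal-width discretization (not the BA- or WOA-optimized variants from the paper) might look like the following; the function name and the NumPy-based approach are illustrative assumptions:

```python
import numpy as np

def equal_width_bins(values, n_bins):
    """Discretize a continuous feature into n_bins equal-width intervals,
    returning an integer bin index (0 .. n_bins-1) for each value."""
    values = np.asarray(values, dtype=float)
    lo, hi = values.min(), values.max()
    width = (hi - lo) / n_bins
    # Interior cut points; np.digitize maps each value to the bin it falls in,
    # and the maximum value naturally lands in the last bin.
    edges = lo + width * np.arange(1, n_bins)
    return np.digitize(values, edges)

# Example: ten values split into three equal-width bins of width 3
x = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
labels = equal_width_bins(x, 3)  # bins [1,4), [4,7), [7,10]
```

In the paper's approach, an optimizer (BA for EB, WOA for EW) would then tune the discretization, for example by searching over the number of bins, rather than fixing it in advance as done here.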
Published in: Indonesian Journal of Electrical Engineering and Computer Science