Abstract

Classification algorithms play an important role in Machine Learning, but no single algorithm performs equally well in every case. Algorithm performance can be affected by the type of data used, differences in problem characteristics, and the parameters chosen. In addition, ensemble learning techniques such as Bagging can affect algorithm performance. This raises the problem of how to choose the most suitable algorithm for a particular classification task and how to optimize its performance. This research carries out a comparative analysis and optimization of classification algorithms in Machine Learning. The classification algorithms evaluated are Support Vector Machine (SVM), Neural Network, Logistic Regression, Decision Tree, and K-Nearest Neighbors (K-NN). The performance of these algorithms is evaluated using the confusion matrix, the Receiver Operating Characteristic (ROC) curve, and the Area Under the Curve (AUC). The result of this research is a comparative analysis of the optimization of classification algorithms using the bagging technique. After the evaluation process using the confusion matrix and ROC curve, it was found that optimization with the bagging technique only affected the Decision Tree (DT) and K-Nearest Neighbors (KNN) algorithms. The accuracy of the DT algorithm increased by 0.6%, while the accuracy of KNN increased by 1.3%. The AUC value of the DT algorithm increased by 1.4%, and that of the KNN algorithm increased by 0.3%.
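
For illustration only, the sketch below shows how bagging can be applied to the Decision Tree and K-NN classifiers and then evaluated with accuracy, the confusion matrix, and AUC. It assumes scikit-learn and uses a stand-in dataset (load_breast_cancer); the abstract does not specify the dataset, hyperparameters, or implementation actually used in the study.

# Minimal sketch: comparing plain vs. bagged DT and KNN classifiers.
# Dataset and parameter choices here are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, roc_auc_score, confusion_matrix

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

base_models = {
    "DT": DecisionTreeClassifier(random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}

for name, base in base_models.items():
    variants = [
        ("plain", base),
        ("bagged", BaggingClassifier(base, n_estimators=50, random_state=0)),
    ]
    for label, model in variants:
        model.fit(X_train, y_train)
        pred = model.predict(X_test)
        proba = model.predict_proba(X_test)[:, 1]
        # Report accuracy, AUC, and the confusion matrix for each variant
        print(name, label,
              "acc=%.3f" % accuracy_score(y_test, pred),
              "auc=%.3f" % roc_auc_score(y_test, proba))
        print(confusion_matrix(y_test, pred))

Comparing the "plain" and "bagged" rows for each base learner mirrors the comparison reported in the abstract, where bagging improved only the DT and KNN results.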
