Abstract

Ensemble learning combines multiple classification models and can improve on the predictive performance of its component models. The effectiveness of such a combination, however, typically depends on the diversity and accuracy of the component predictions, and multi-class data remain a challenge. In the proposed approach, cost-sensitive learning was applied to evaluate the prediction accuracy of each class and to construct a cost-sensitivity matrix of true positive (TP) rates. The TP rate of a class serves as a weight that is combined with the predicted probability to drive ensemble learning for that class. We propose a heterogeneous ensemble model, i.e., a combination of different individual classifiers (support vector machine, Bayes, K-nearest neighbour, naïve Bayes, decision tree, and multi-layer perceptron), and evaluate it in experiments with 3-, 4-, 5- and 6-classifier configurations. The efficiency of the proposed models was compared with that of the individual classifiers and of homogeneous ensemble models (AdaBoost, bagging, stacking, voting, random forest, and random subspaces) on various multi-class data sets. The experimental results demonstrate that the cost-sensitive probability weighted voting ensemble model built from 3 classifiers provided the most accurate results for multi-class prediction. The objective of this study was to increase the efficiency of predicting classification results in multi-class classification tasks and to improve the classification results.
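The abstract describes per-class TP rates, collected in a cost-sensitivity matrix, being used as class weights. The following is a minimal sketch of one way such weights could be computed from a confusion matrix; the function name `tp_rate_weights` and its signature are illustrative assumptions, not the authors' code.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def tp_rate_weights(y_true, y_pred, n_classes):
    """Per-class TP rate (recall), used as the cost-sensitive weight of each class."""
    cm = confusion_matrix(y_true, y_pred, labels=list(range(n_classes)))
    support = cm.sum(axis=1)                      # actual instances per class
    rates = np.divide(cm.diagonal(), support,
                      out=np.zeros(n_classes, dtype=float),
                      where=support > 0)          # avoid division by zero for empty classes
    return rates                                  # shape (n_classes,)
```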

Highlights

  • Multi-class classification, in which a decision-making process assigns instances to one of multiple label classes, is a fundamental problem in supervised learning [1,2,3]

  • The approach focuses on the combination strategy, a weighted voting method in which appropriate weight values are assigned to the classifiers without degrading model effectiveness; improved classification accuracy is obtained by refining the weight assignment

  • The proposed framework is cost-sensitive probability weighting based on ensemble learning; it differs from earlier classification work [42, 62], which focused on pattern recognition with individual base models and did not account for the class distribution of the predicted results, a limitation that can lead to information bias

Introduction

Multi-class classification, in which a decision-making process assigns instances to one of multiple label classes, is a fundamental problem in supervised learning [1,2,3]. The resulting class decisions are complex and difficult to manage [29]. Two main approaches are used to build classifiers for multi-class data: the traditional base-model approach and the ensemble-model approach. Many studies [40, 56,57,58] attempt to optimize the probability weights obtained when an ensemble model predicts the class results. This work focuses on the combination strategy, specifically weighted voting, in which appropriate weight values are assigned to the classifiers without degrading model effectiveness, and classification accuracy is improved by refining the weight assignment. Selecting the result class from a single pattern without appropriate randomization produces incorrect class predictions because of incorrect probability weight assignments [5, 60, 61]. The present work aims to improve the efficiency of the combination method by assigning new cost-sensitive weights so that the probability weight of each class result is more accurate.
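A minimal sketch of how such cost-sensitive probability weighted voting could be combined, assuming each base model exposes predict_proba and that its per-class TP-rate weights were computed on a held-out validation split (e.g., with the `tp_rate_weights` helper sketched earlier). This is an illustrative reading of the combination strategy, not the paper's verbatim algorithm; the variable names in the usage example are hypothetical.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

def cost_sensitive_weighted_vote(models, class_weights, X):
    """Sum each model's class probabilities scaled by its per-class TP-rate weights."""
    n_classes = len(class_weights[0])
    scores = np.zeros((X.shape[0], n_classes))
    for model, w in zip(models, class_weights):
        scores += model.predict_proba(X) * w      # broadcast weights over the class axis
    return scores.argmax(axis=1)                  # pick the class with the highest weighted score

# Usage sketch for a 3-classifier heterogeneous ensemble (hypothetical data names):
# X_train, X_val, y_train, y_val = train_test_split(X, y, stratify=y, random_state=0)
# models = [SVC(probability=True).fit(X_train, y_train),
#           KNeighborsClassifier().fit(X_train, y_train),
#           DecisionTreeClassifier().fit(X_train, y_train)]
# class_weights = [tp_rate_weights(y_val, m.predict(X_val), n_classes)
#                  for m in models]               # TP-rate weights from the validation split
# y_pred = cost_sensitive_weighted_vote(models, class_weights, X_test)
```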

