Abstract

AdaBoost.M1 has been applied successfully to improve the accuracy of a learning algorithm on multi-class classification problems. However, its required conditions can be hard to satisfy in practice. An improved algorithm, called AdaBoost.MK, is developed to address this problem. Earlier support vector machine (SVM)-based multi-class classification algorithms work by splitting the original problem into a set of two-class subproblems, and the time and space they require can be prohibitive. We develop a multi-class classification algorithm by incorporating one-class SVMs with a well-designed discriminant function. Finally, a hybrid method integrating AdaBoost.MK and one-class SVMs is proposed to solve multi-class classification problems. Experimental results on data sets from UCI and Statlog show that the proposed approach outperforms other multi-class algorithms, such as support vector data descriptions (SVDDs) and AdaBoost.M1 with one-class SVMs, and the improvement is statistically significant.
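
The abstract does not detail the discriminant function used to combine the per-class one-class SVMs, so the following is only a minimal sketch of the general idea: train one one-class SVM per class and label a test point with the class whose model returns the largest decision value. The class name OneClassSVMMulticlass, the RBF kernel, the nu setting, and the use of the raw decision value as the discriminant are illustrative assumptions, not the paper's method.

```python
# Hypothetical sketch of a one-class-SVM-per-class multi-class classifier.
# The paper's "well-designed discriminant function" is not given in the
# abstract; the raw one-class SVM decision value is used here as a stand-in.
import numpy as np
from sklearn.svm import OneClassSVM


class OneClassSVMMulticlass:
    """Fit one one-class SVM per class; predict by the largest decision value."""

    def __init__(self, nu=0.1, gamma="scale"):
        self.nu = nu
        self.gamma = gamma
        self.models_ = {}

    def fit(self, X, y):
        # Train a separate one-class SVM on the samples of each class.
        for label in np.unique(y):
            model = OneClassSVM(kernel="rbf", nu=self.nu, gamma=self.gamma)
            model.fit(X[y == label])
            self.models_[label] = model
        return self

    def predict(self, X):
        labels = list(self.models_)
        # Decision values: signed distance of each sample to each class boundary.
        scores = np.column_stack(
            [self.models_[label].decision_function(X) for label in labels]
        )
        return np.asarray(labels)[np.argmax(scores, axis=1)]


# Example usage on a small benchmark data set.
if __name__ == "__main__":
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = OneClassSVMMulticlass(nu=0.1).fit(X_tr, y_tr)
    print("accuracy:", np.mean(clf.predict(X_te) == y_te))
```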
