Abstract

Support vector machines (SVMs) were originally designed to solve binary classification problems, but many real-world tasks involve multiple classes. Multiclassification methods based on SVM fall into direct methods and indirect methods; the indirect methods, which integrate multiple binary classifiers according to certain rules to form a multiclassification model, are currently the most commonly used. In this paper, an improved multiclassification algorithm based on the balanced binary decision tree, called the IBDT-SVM algorithm, is proposed. The algorithm considers not only the "between-classes distance" and the "class variance" used in traditional measures of between-classes separability but also the "between-classes variance," yielding a new, improved between-classes separability measure. Based on this measure, the two classes with the largest between-classes separability are selected as the positive and negative samples to train and learn the first classifier. Then, following the class-grouping-by-majority principle, each remaining class is assigned to whichever of these two classes it is closer to and merged into the positive or negative samples, and the SVM classifier is trained again. For samples with uneven or sparse distributions, this method avoids the errors caused by the shortest center-distance classification method and largely overcomes the "error accumulation" problem of the traditional binary decision tree, thereby producing a better classifier. The nodes at each layer of the decision tree are traversed in this way until the output classification result is a single-class label. The experimental results show that the proposed IBDT-SVM algorithm achieves better classification accuracy and effectiveness on multiclassification problems.
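
The abstract describes the tree-building procedure only in words, so the following is a minimal, hypothetical sketch of that idea in Python: the `separability` function below is a stand-in combining between-classes distance, class variance, and between-classes variance (the paper defines its own measure), and `build_node` illustrates seed-pair selection, class-grouping-by-majority, and retraining at each node; none of these names come from the paper.

```python
# Hypothetical sketch of the IBDT-SVM tree-building idea (not the paper's exact method).
import numpy as np
from sklearn.svm import SVC

def separability(Xa, Xb):
    """Stand-in between-classes separability measure: center distance scaled by
    between-classes variance and divided by within-class scatter."""
    ca, cb = Xa.mean(axis=0), Xb.mean(axis=0)
    dist = np.linalg.norm(ca - cb)                         # between-classes distance
    scatter = Xa.var(axis=0).sum() + Xb.var(axis=0).sum()  # class variance
    between_var = np.vstack([ca, cb]).var(axis=0).sum()    # between-classes variance
    return dist * between_var / (scatter + 1e-12)

def build_node(X, y, classes):
    """Recursively build one node of the binary decision tree of SVM classifiers."""
    if len(classes) == 1:
        return {"label": classes[0]}                       # leaf: single-class label
    # 1. pick the pair of classes with the largest separability measure as seeds
    best, seeds = -np.inf, None
    for i, a in enumerate(classes):
        for b in classes[i + 1:]:
            s = separability(X[y == a], X[y == b])
            if s > best:
                best, seeds = s, (a, b)
    pos, neg = seeds
    # 2. train a seed SVM on the two seed classes only
    seed_mask = np.isin(y, seeds)
    seed_clf = SVC(kernel="rbf").fit(X[seed_mask], (y[seed_mask] == pos).astype(int))
    # 3. class-grouping-by-majority: each remaining class joins the side
    #    that the majority of its samples are predicted to belong to
    left, right = [pos], [neg]
    for c in classes:
        if c in seeds:
            continue
        votes = seed_clf.predict(X[y == c])
        (left if votes.mean() >= 0.5 else right).append(c)
    # 4. retrain the node SVM on the merged positive/negative groups
    node_y = np.isin(y, left).astype(int)
    clf = SVC(kernel="rbf").fit(X, node_y)
    return {"clf": clf,
            "pos": build_node(X[node_y == 1], y[node_y == 1], left),
            "neg": build_node(X[node_y == 0], y[node_y == 0], right)}
```

A tree would be built with, for example, `tree = build_node(X, y, list(np.unique(y)))`, and a test sample classified by following the node SVMs until a leaf's single-class label is reached.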

Highlights

  • Support vector machines (SVMs) were originally designed to solve binary classification problems, but many real-world tasks involve multiple classes. Multiclassification methods based on SVM fall into direct methods and indirect methods; the indirect methods, which integrate multiple binary classifiers according to certain rules to form a multiclassification model, are currently the most commonly used

  • An improved multiclassification algorithm based on the balanced binary decision tree, called the IBDT-SVM algorithm, is proposed

  • The nodes at each layer of the decision tree are traversed until the output classification result is a single-class label. The experimental results show that the proposed IBDT-SVM algorithm achieves better classification accuracy and effectiveness on multiclassification problems


Summary

The Introduction of Support Vector Classifier

Consider a training set of m samples, S = {(x₁, y₁), (x₂, y₂), ..., (xₘ, yₘ)}, where xᵢ ∈ Rⁿ and yᵢ ∈ {−1, +1} for i = 1, 2, ..., m. Unlike many other classification algorithms, SVM handles classification in high-dimensional spaces well through kernel functions. A mapping φ: Rⁿ ⟶ F carries the samples from the input space into a higher-dimensional feature space, where the optimal separating hyperplane is constructed. In this way, the problem above is transformed into the standard soft-margin optimization problem: min (1/2)‖w‖² + C∑ᵢξᵢ, subject to yᵢ(w·φ(xᵢ) + b) ≥ 1 − ξᵢ, ξᵢ ≥ 0, i = 1, 2, ..., m, where w and b define the separating hyperplane, ξᵢ are the slack variables, and C > 0 is the penalty parameter.
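
As a brief illustration (the paper does not prescribe any particular implementation), the sketch below solves this optimization problem with scikit-learn's SVC on a synthetic two-class dataset; the RBF kernel plays the role of the implicit mapping φ, and C is the penalty on the slack variables ξᵢ.

```python
# Minimal soft-margin SVM example with an RBF kernel (synthetic data, assumed setup).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# two toy classes in R^2 with labels y in {-1, +1}
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(+1.0, 0.5, (50, 2))])
y = np.array([-1] * 50 + [+1] * 50)

# C penalizes the slack variables xi_i; the RBF kernel implements the mapping phi implicitly
clf = SVC(kernel="rbf", C=1.0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```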

The Analysis of the Multiclassification SVM Algorithms
The Design of IBDT-SVM Algorithm
The Numerical Experiments and Results
Conclusion