Abstract

The decision-tree-based support vector machine, which combines support vector machines with a decision tree, is an effective approach to multi-class classification. A problem with this method is that the partitioning of the feature space depends on the structure of the decision tree, and the tree structure is closely related to the performance of the classifier. To maintain high generalization ability, the most separable classes should be separated at the upper nodes of the decision tree. A distance measure is often used as a separability measure between classes, but the distance between class centers cannot reflect the distribution of the classes. After analyzing the relationship between the tree structure and the classification performance of the decision-tree-based support vector machine, a new separability measure is defined based on the distribution of the training samples in the feature space. This measure is used to construct the decision tree, and an improved algorithm for the decision-tree-based support vector machine is proposed. Classification experiments demonstrate the effectiveness of the improved algorithm.
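
The abstract does not give the exact form of the distribution-based separability measure or the tree-construction procedure, so the following is only a minimal sketch of the general idea: at each node, split off the class judged most separable from the remaining classes with a binary SVM, using a separability score that accounts for class scatter rather than center distance alone. The Fisher-style ratio used here is an illustrative assumption, not the authors' formula.

    # Sketch of a decision-tree-based SVM with a distribution-aware
    # separability measure (assumed Fisher-style ratio, not the paper's exact definition).
    import numpy as np
    from sklearn.svm import SVC

    def separability(X, y, c, rest):
        """Separability of class c from the remaining classes:
        distance between class centers scaled by within-class scatter."""
        Xc, Xr = X[y == c], X[np.isin(y, rest)]
        center_dist = np.linalg.norm(Xc.mean(axis=0) - Xr.mean(axis=0))
        scatter = Xc.std(axis=0).mean() + Xr.std(axis=0).mean()
        return center_dist / (scatter + 1e-12)

    def build_tree(X, y, classes=None, **svm_params):
        """Recursively split off the most separable class at each node."""
        classes = sorted(set(y)) if classes is None else classes
        if len(classes) == 1:
            return {"leaf": classes[0]}
        # place the easiest (most separable) split at the upper node
        best = max(classes,
                   key=lambda c: separability(X, y, c, [k for k in classes if k != c]))
        rest = [c for c in classes if c != best]
        clf = SVC(**svm_params).fit(X, (y == best).astype(int))
        keep = np.isin(y, rest)
        return {"clf": clf, "pos": best,
                "neg": build_tree(X[keep], y[keep], rest, **svm_params)}

    def predict_one(node, x):
        """Walk the tree until a leaf class is reached."""
        while "leaf" not in node:
            if node["clf"].predict(x.reshape(1, -1))[0] == 1:
                return node["pos"]
            node = node["neg"]
        return node["leaf"]

In this one-against-rest tree, choosing the split order by a distribution-aware score rather than by center distance alone is what the abstract identifies as the key to preserving generalization ability at the upper nodes.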
