Abstract

The decision tree methodology is an important nonparametric technique for building classifiers from a set of training examples. Most existing top-down decision tree design methods use single-feature splits at successive stages of the tree design. While computationally attractive, single-feature splits generally lead to large trees and inferior classification performance. This paper presents a new top-down decision tree design method that generates compact trees with superior performance by using multifeature splits in place of single-feature splits at successive stages of tree development. The multifeature splits in the proposed method are obtained by combining the concept of the information measure of a partition with perceptron learning. Decision tree induction results for a broad range of classification problems are presented to demonstrate the strengths of the proposed method.
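The abstract does not give the algorithmic details, but the following minimal Python sketch illustrates the general idea under stated assumptions: a perceptron learns a linear (multifeature) split of the examples at a tree node, and that split is then scored by the entropy reduction (information measure) of the partition it induces. All function names, the binary-class restriction, and the hyperparameters here are illustrative assumptions, not the paper's actual procedure.

```python
# Sketch only: one node-level multifeature split, not the authors' full method.
import numpy as np

def entropy(labels):
    """Shannon entropy of a label vector."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def perceptron_split(X, y, epochs=50, lr=0.1):
    """Learn a linear split w.x + b >= 0 with plain perceptron updates.
    Assumes binary labels y in {0, 1}; handling more classes would need a
    class-grouping step the abstract does not describe."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    t = np.where(y == 1, 1, -1)              # perceptron targets in {-1, +1}
    for _ in range(epochs):
        for i in range(n):
            if t[i] * (X[i] @ w + b) <= 0:   # misclassified -> update
                w += lr * t[i] * X[i]
                b += lr * t[i]
    return w, b

def split_information_gain(X, y, w, b):
    """Entropy reduction achieved by partitioning the node with the split."""
    mask = (X @ w + b) >= 0
    if mask.all() or not mask.any():         # degenerate split: no partition
        return 0.0
    n = len(y)
    return (entropy(y)
            - (mask.sum() / n) * entropy(y[mask])
            - ((~mask).sum() / n) * entropy(y[~mask]))

# Toy usage: a class boundary no single-feature (axis-parallel) split can capture.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
w, b = perceptron_split(X, y)
print("information gain of multifeature split:", split_information_gain(X, y, w, b))
```

In a full top-down induction, a split of this kind would presumably be learned at each node in turn, with the node's examples routed to the two children and the process repeated until some purity or stopping criterion is met; the paper's exact stopping and multiclass rules are not stated in the abstract.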
