Abstract

The design of algorithms that explore multiple representation languages and different search spaces has an intuitive appeal. In the context of classification problems, algorithms that generate multivariate trees explore multiple representation languages by using decision tests based on combinations of attributes. The same applies to model-tree algorithms in regression domains, which use linear models at leaf nodes. In this paper, we study where to use combinations of attributes in decision tree learning. We present an algorithm for multivariate tree learning that combines a univariate decision tree with a discriminant function by means of constructive induction. The algorithm can use decision nodes with multivariate tests and leaf nodes that predict a class using a discriminant function. Multivariate decision nodes are built while growing the tree, whereas functional leaves are built while pruning it. Functional trees can be seen as a generalization of multivariate trees. We compared our algorithm against its components and two simplified versions on 30 benchmark data sets. The experimental evaluation shows that our algorithm has clear advantages in generalization ability and model size at statistically significant confidence levels.
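To make the two kinds of functional nodes concrete, the following is a minimal sketch (not the paper's implementation; all class and parameter names are illustrative) of a tree in which internal nodes may test a linear combination of attributes and leaves may predict via a linear discriminant rather than a constant class label:

```python
# Illustrative sketch of a "functional tree" structure: internal nodes
# may apply a multivariate test (a univariate test is the special case
# of a one-hot weight vector), and leaves may predict with a linear
# discriminant function. Names and shapes here are assumptions.

class Leaf:
    def __init__(self, weights, bias):
        # Discriminant at the leaf: predict class 1 if w.x + b > 0.
        self.weights, self.bias = weights, bias

    def predict(self, x):
        score = sum(w * xi for w, xi in zip(self.weights, x)) + self.bias
        return 1 if score > 0 else 0

class Node:
    def __init__(self, weights, threshold, left, right):
        # Multivariate decision test: route left if w.x <= threshold.
        self.weights, self.threshold = weights, threshold
        self.left, self.right = left, right

    def predict(self, x):
        value = sum(w * xi for w, xi in zip(self.weights, x))
        branch = self.left if value <= self.threshold else self.right
        return branch.predict(x)

# Toy tree: a univariate root test on attribute 0 (one-hot weights),
# with discriminant functions at both leaves.
tree = Node(
    weights=[1.0, 0.0], threshold=0.5,
    left=Leaf(weights=[0.0, 1.0], bias=-1.0),   # class by attribute 1
    right=Leaf(weights=[-1.0, 1.0], bias=0.0),  # class by x1 - x0
)
print(tree.predict([0.2, 2.0]))  # routed left; leaf discriminant -> 1
```

In this sketch the root's one-hot weight vector reduces to an ordinary univariate split, while the leaves replace majority-class prediction with a discriminant, mirroring the paper's distinction between multivariate decision nodes (built while growing) and functional leaves (built while pruning).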
