Abstract

The alternating decision tree (ADTree) is a special decision tree representation that brings interpretability to boosting, a well-established ensemble algorithm, and has found success in a wide range of applications. However, existing ADTree variants implement only univariate decision nodes, so potential interactions between features are ignored; to date, there has been no multivariate ADTree. We propose a sparse multivariate ADTree that remains comprehensible. The proposed sparse ADTree is evaluated empirically on UCI datasets as well as spectral datasets from the University of Eastern Finland (UEF). We show that sparse ADTree is competitive with both univariate decision trees (the original ADTree, C4.5, and CART) and multivariate decision trees (Fisher's decision tree and a single multivariate decision tree from oblique random forest). It achieves the best average rank in prediction accuracy, the second best in decision tree size, and faster induction than the existing ADTree. In addition, it performs especially well on datasets with correlated features, such as the UEF spectral datasets. The proposed sparse ADTree thus extends the applicability of ADTree to a wider variety of applications.
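The distinction the abstract draws between univariate and sparse multivariate decision nodes can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the function names and the particular weight vector are assumptions, chosen to show that a sparse oblique test uses only a few nonzero feature weights and so stays readable.

```python
def univariate_split(x, feature, threshold):
    # Axis-parallel test used by C4.5, CART, and the original ADTree:
    # the decision depends on a single feature.
    return x[feature] <= threshold


def sparse_multivariate_split(x, weights, threshold):
    # Oblique (multivariate) test: compare a linear combination of
    # features against a threshold. Sparsity means most weights are
    # zero, so the node remains interpretable while still capturing
    # interactions between the few features with nonzero weights.
    return sum(w * v for w, v in zip(weights, x)) <= threshold


# Toy example: 4 features, but only features 0 and 2 enter the
# oblique test (2 of 4 weights are nonzero -- the "sparse" part).
x = [1.0, 5.0, 2.0, -3.0]
w = [0.5, 0.0, 1.0, 0.0]

print(univariate_split(x, feature=0, threshold=1.5))    # True  (1.0 <= 1.5)
print(sparse_multivariate_split(x, w, threshold=2.0))   # False (2.5 >  2.0)
```

A univariate node can only cut the feature space with axis-parallel boundaries; the sparse oblique node above cuts along a tilted hyperplane involving two features, which is what lets it model correlated features (as in the UEF spectral datasets) more compactly.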
