Abstract

Two decision trees are called decision equivalent if they represent the same function, i.e., they yield the same result for every possible input. We prove that the following problem is NP-complete: given a decision tree and a number, decide whether there is a decision-equivalent decision tree of size at most that number. As a consequence, finding a decision tree of minimal size that is decision equivalent to a given decision tree is an NP-hard problem. This result differs from the well-known NP-hardness of finding a decision tree of minimal size that is consistent with a given training set. Instead, our result is a basic result about decision trees themselves, outside the setting of inductive inference. On the other hand, this result also differs from similar results for BDDs and OBDDs: since no sharing is allowed in decision trees, the notion of decision tree size is essentially different from BDD size.
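The following is a minimal illustrative sketch (not taken from the paper) of the notions used above: decision trees are encoded as nested tuples, size counts nodes without sharing, and decision equivalence is checked by brute-force evaluation on every input, which takes exponential time in the number of variables. All names and the tree encoding are assumptions made for illustration only.

```python
from itertools import product

# Illustrative encoding (an assumption, not the paper's formalism):
# a tree is either a leaf label ("0"/"1") or a tuple (variable, low_subtree, high_subtree).

def evaluate(tree, assignment):
    """Evaluate a decision tree on a truth assignment (dict: variable -> bool)."""
    while not isinstance(tree, str):
        var, low, high = tree
        tree = high if assignment[var] else low
    return tree

def size(tree):
    """Number of nodes; without sharing, identical subtrees are counted separately."""
    if isinstance(tree, str):
        return 1
    _, low, high = tree
    return 1 + size(low) + size(high)

def decision_equivalent(t1, t2, variables):
    """Brute-force check: the trees agree on every possible input."""
    return all(
        evaluate(t1, dict(zip(variables, bits))) == evaluate(t2, dict(zip(variables, bits)))
        for bits in product([False, True], repeat=len(variables))
    )

# Two decision-equivalent trees for f(x, y) = x: the redundant test on y can be pruned,
# so the first tree is not of minimal size among its decision-equivalent trees.
redundant = ("x", "0", ("y", "1", "1"))
minimal = ("x", "0", "1")
assert decision_equivalent(redundant, minimal, ["x", "y"])
assert size(minimal) < size(redundant)
```

The paper's hardness result concerns the converse task: given a tree such as `redundant` and a bound, deciding whether some decision-equivalent tree fits within that bound is NP-complete, so no efficient minimization procedure is expected in general.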
