Abstract

Decision trees play an important role in knowledge representation because of their simplicity and self-explanatory nature. We study the optimization of decision tree parameters in order to find trees that are both shorter and more accurate. Since these two criteria conflict, we need to find a decision tree whose parameters represent a suitable trade-off between them. To this end, we design two algorithms, based on a bi-criteria optimization technique, that build a decision tree subject to a given threshold on the number of vertices. We then calculate the local and global misclassification rates for these trees. Our goal is to study the effect of changing the threshold in the bi-criteria optimization of decision trees. We apply our algorithms to 13 decision tables from the UCI Machine Learning Repository and recommend a suitable threshold that yields more accurate decision trees with a reasonable number of vertices.
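The trade-off described above can be illustrated with a toy sketch. This is not the paper's bi-criteria algorithm: it is a simple greedy tree builder over a hypothetical decision table that stops splitting once a vertex budget is spent, so a tighter threshold yields a smaller but less accurate tree, while a looser one fits the table more closely.

```python
from collections import Counter

def majority(labels):
    """Most common label (used as the leaf prediction)."""
    return Counter(labels).most_common(1)[0][0]

def build_tree(rows, labels, attrs, budget):
    """Greedy sketch of threshold-constrained tree construction.
    `budget` is a one-element list holding the remaining vertex count,
    shared across recursive calls; the threshold is enforced only
    approximately (sibling fan-out may overshoot it slightly)."""
    budget[0] -= 1  # this vertex is now spent
    if len(set(labels)) == 1 or not attrs or budget[0] <= 0:
        return ("leaf", majority(labels))

    def split_errors(a):
        # majority-vote errors remaining after splitting on attribute a
        groups = {}
        for row, label in zip(rows, labels):
            groups.setdefault(row[a], []).append(label)
        return sum(len(g) - Counter(g).most_common(1)[0][1]
                   for g in groups.values())

    best = min(attrs, key=split_errors)
    children = {}
    for value in set(row[best] for row in rows):
        sub = [(r, l) for r, l in zip(rows, labels) if r[best] == value]
        srows, slabels = zip(*sub)
        children[value] = build_tree(list(srows), list(slabels),
                                     [a for a in attrs if a != best], budget)
    return ("node", best, children)

def classify(tree, row):
    while tree[0] == "node":
        _, attr, children = tree
        tree = children[row[attr]]
    return tree[1]

def misclassification_rate(tree, rows, labels):
    wrong = sum(classify(tree, r) != l for r, l in zip(rows, labels))
    return wrong / len(rows)

# Toy decision table (hypothetical data, not from the UCI repository).
rows = [{"outlook": "sunny", "humid": 1},
        {"outlook": "sunny", "humid": 0},
        {"outlook": "rain",  "humid": 1},
        {"outlook": "rain",  "humid": 0}]
labels = ["no", "yes", "yes", "yes"]

# A loose threshold lets the tree fit the table exactly; a threshold of
# one vertex forces a single majority leaf with higher error.
loose = build_tree(rows, labels, ["outlook", "humid"], [7])
tight = build_tree(rows, labels, ["outlook", "humid"], [1])
print(misclassification_rate(loose, rows, labels))  # 0.0
print(misclassification_rate(tight, rows, labels))  # 0.25
```

The error rates here are computed on the training table itself, which corresponds to a local (resubstitution) misclassification rate; the paper's global rate would require held-out data.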
