Abstract

Decision trees are popular classifiers because their transparent, branched structure resembles the way humans reason through decisions. However, the comprehensibility of a tree can be undermined by bias in split-attribute selection: traditional heuristic measures tend to favour multi-valued attributes. This paper proposes a node attribute selection method based on a concept model of decision trees, with the aim of avoiding the heuristic bias of attribute measurement and improving decision-tree performance. The concept model extracted from the given data is defined and expressed in probabilistic-statistical form, combining the certainty of the class distribution with that of the branch distribution to give a certainty description of the tree. Class constraint uncertainty (CCE) is then used as the heuristic measure for selecting the split attribute during tree induction, with the handling of missing branches serving as an auxiliary leaf measure, yielding a novel decision-tree learning algorithm. Experimental findings show that CCE is effective as a heuristic measure: it avoids the bias toward multi-valued attributes on all tested datasets and improves the performance and stability of the resulting decision trees.
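The CCE measure itself is not defined in this abstract, so it cannot be reproduced here. As background, the sketch below (a minimal, self-contained illustration, not the paper's method) shows the multi-value bias the abstract refers to: under the classical information-gain heuristic, an ID-like attribute with a unique value per row produces pure single-row partitions and therefore receives the maximum possible gain, outscoring a genuinely informative feature even though the ID cannot generalize.

```python
# Illustration of the multi-value bias of information gain (ID3-style),
# the kind of heuristic bias the abstract says CCE is designed to avoid.
# All data and names here are hypothetical toy examples.
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr, labels):
    """Entropy reduction from splitting `rows` on attribute index `attr`."""
    n = len(rows)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attr], []).append(label)
    remainder = sum(len(p) / n * entropy(p) for p in partitions.values())
    return entropy(labels) - remainder

# Toy data: column 0 is an ID (unique per row), column 1 is a real feature.
rows = [(0, 'a'), (1, 'a'), (2, 'b'), (3, 'b')]
labels = ['yes', 'yes', 'no', 'yes']

gain_id = information_gain(rows, 0, labels)       # ~0.811: maximal
gain_feature = information_gain(rows, 1, labels)  # ~0.311: penalized
print(f"ID column gain:      {gain_id:.3f}")
print(f"feature column gain: {gain_feature:.3f}")
```

The ID column "wins" despite carrying no generalizable information, which is why measures that correct for branch multiplicity (gain ratio in C4.5, or a certainty-based measure such as the CCE proposed here) are preferred over raw information gain.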
