Abstract

In this paper, we propose a decision-tree learning algorithm that uses the information-entropy minimization heuristic together with a mutual-information heuristic to select expansion attributes. On data sets whose condition attributes take continuous values, most current decision-tree learning algorithms tend to select previously chosen attributes again for branching. This repeated selection limits training and testing accuracy, and the structure of the resulting trees may become complex. Therefore, during attribute selection, previously selected attributes, as well as attributes that are highly correlated with them, should not be chosen again. We use mutual information to avoid reselecting such attributes during decision-tree generation, and our test results show that this method achieves good performance.
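The selection rule described above can be sketched in code. The following is a minimal illustration, not the paper's actual algorithm: function names, the MI threshold of 0.5, and the discrete-valued treatment of attributes are all assumptions for the sketch. A candidate attribute is skipped if it was already selected or if its mutual information with any previously selected attribute exceeds the threshold; among the remaining candidates, the one with the highest information gain (i.e., lowest conditional entropy) is chosen.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of a sequence of discrete values."""
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in Counter(values).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def select_attribute(data, labels, candidates, selected, mi_threshold=0.5):
    """Pick the attribute with the highest information gain, skipping
    previously selected attributes and any candidate whose mutual
    information with a previously selected attribute is too high.
    `mi_threshold` is an illustrative value, not from the paper."""
    base = entropy(labels)
    best, best_gain = None, -1.0
    for a in candidates:
        if a in selected:
            continue  # never branch on the same attribute twice
        col = [row[a] for row in data]
        if any(mutual_information(col, [row[s] for row in data]) > mi_threshold
               for s in selected):
            continue  # too correlated with an earlier split attribute
        # information gain = H(labels) - H(labels | attribute a)
        n = len(labels)
        groups = {}
        for v, y in zip(col, labels):
            groups.setdefault(v, []).append(y)
        cond = sum(len(g) / n * entropy(g) for g in groups.values())
        gain = base - cond
        if gain > best_gain:
            best, best_gain = a, gain
    return best
```

For continuous condition attributes, the values would first be discretized (e.g., by entropy-based binning of candidate cut points) before the entropy and mutual-information computations above apply.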
