Abstract

Decision tree learning algorithms have a long history and remain popular because of their efficiency. Tree construction proceeds greedily from the root down to the leaves, producing a "near-optimal" tree. A key step is the choice, at each internal node, of the feature on which to split, which directly affects classification accuracy; performed exhaustively, this search can be computationally expensive. This article proposes a new approach to constructing decision trees based on differential evolution for two-level split optimization: at the first level a splitting feature is selected, and at the second level the threshold value for that feature is optimized. The approach was evaluated on several benchmark classification problems.
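The two-level idea described in the abstract can be illustrated with a minimal sketch. Note the assumptions: this is not the authors' implementation; it uses a hand-rolled differential evolution loop with a joint encoding (one gene rounded to a feature index, one gene for the threshold) rather than two explicit stages, it minimizes weighted Gini impurity as the split criterion, and the toy dataset and all function names are invented for illustration.

```python
import numpy as np

def weighted_gini(y_left, y_right):
    """Weighted Gini impurity of a binary split (lower is better)."""
    def gini(y):
        if len(y) == 0:
            return 0.0
        p = np.bincount(y) / len(y)
        return 1.0 - np.sum(p ** 2)
    n = len(y_left) + len(y_right)
    return (len(y_left) * gini(y_left) + len(y_right) * gini(y_right)) / n

def split_cost(params, X, y):
    # params[0] encodes the feature index (rounded and clipped),
    # params[1] is the threshold for that feature.
    f = min(max(int(round(params[0])), 0), X.shape[1] - 1)
    mask = X[:, f] <= params[1]
    return weighted_gini(y[mask], y[~mask])

def differential_evolution(cost, bounds, X, y,
                           pop_size=20, gens=50, F=0.8, CR=0.9, seed=0):
    """Minimal DE/rand/1/bin optimizer over box bounds."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    costs = np.array([cost(p, X, y) for p in pop])
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals other than i.
            idx = rng.choice([j for j in range(pop_size) if j != i],
                             3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover with at least one mutant gene.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection.
            tc = cost(trial, X, y)
            if tc <= costs[i]:
                pop[i], costs[i] = trial, tc
    best = np.argmin(costs)
    return pop[best], costs[best]

# Toy data: the classes are separable by a threshold on feature 1.
X = np.array([[0.1, 1.0], [0.2, 1.1], [0.3, 3.0],
              [0.4, 3.2], [0.5, 0.9], [0.6, 3.1]])
y = np.array([0, 0, 1, 1, 0, 1])
bounds = [(0, X.shape[1] - 1), (X.min(), X.max())]
best, best_cost = differential_evolution(split_cost, bounds, X, y)
```

On this toy problem the optimizer should recover a pure split (cost 0) on feature 1; in a full tree learner the same search would be repeated recursively at every internal node.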
