Abstract

A neural network tree (NNTree) is a decision tree (DT) in which each non-terminal node contains an expert neural network (ENN). Generally speaking, NNTrees can outperform standard axis-parallel DTs because the ENNs can extract more complex features. However, induction of multivariate DTs is very difficult: even if each non-terminal node contains only a simple oblique hyperplane, finding the optimal test function is an NP-complete problem. To address this problem, we have studied two evolutionary algorithms (EAs). One induces the NNTrees by applying the genetic algorithm (GA) recursively, and the other evolves the NNTrees directly. These two algorithms, however, are very time consuming and cannot be used easily. This paper proposes a new EA that combines the GA with the back propagation (BP) algorithm: the GA is used for finding the structure of the NNTree, and BP is used for training the ENNs. Experimental results with 10 public databases show that the proposed algorithm is much more efficient and effective than the existing ones.
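The division of labor described above can be illustrated with a toy sketch: a GA searches over a structural parameter that gradient descent cannot learn (here, simply the tree depth), while a BP-style gradient step trains the ENN at each non-terminal node. Everything below is an illustrative assumption, not the paper's actual algorithm: each "ENN" is reduced to a single logistic unit, and the GA uses plain truncation selection with mutation.

```python
# Hedged sketch of hybrid GA + backpropagation NNTree induction.
# All names, parameters, and design choices here are illustrative
# assumptions; the paper's actual method is not reproduced.
import random
import numpy as np

rng = np.random.default_rng(0)
random.seed(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def train_enn(X, y, epochs=200, lr=0.5):
    """BP part: fit a one-neuron 'expert network' (logistic unit) by gradient descent."""
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = sigmoid(Xb @ w)
        w -= lr * Xb.T @ (p - y) / len(y)       # logistic-loss gradient step
    return w

def predict_enn(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return sigmoid(Xb @ w) > 0.5

def build_tree(X, y, depth):
    """Grow an NNTree of fixed depth; each internal node holds a BP-trained ENN."""
    if depth == 0 or len(set(y)) == 1:
        return ('leaf', round(y.mean()))
    w = train_enn(X, y)
    mask = predict_enn(w, X)
    if mask.all() or (~mask).all():             # degenerate split: fall back to a leaf
        return ('leaf', round(y.mean()))
    return ('node', w,
            build_tree(X[~mask], y[~mask], depth - 1),
            build_tree(X[mask], y[mask], depth - 1))

def tree_predict(tree, x):
    while tree[0] == 'node':
        tree = tree[3] if predict_enn(tree[1], x[None, :])[0] else tree[2]
    return tree[1]

def fitness(depth, X, y):
    tree = build_tree(X, y, depth)
    return np.mean([tree_predict(tree, x) == yi for x, yi in zip(X, y)])

def evolve_depth(X, y, pop=6, gens=5, max_depth=4):
    """GA part: evolve the structural parameter (tree depth) by selection + mutation."""
    population = [random.randint(1, max_depth) for _ in range(pop)]
    for _ in range(gens):
        ranked = sorted(population, key=lambda d: fitness(d, X, y), reverse=True)
        parents = ranked[:pop // 2]
        children = [max(1, min(max_depth, p + random.choice([-1, 1]))) for p in parents]
        population = parents + children
    return max(population, key=lambda d: fitness(d, X, y))

# Toy XOR-style data: not linearly separable, so a single oblique split cannot solve it.
X = rng.uniform(-1, 1, size=(200, 2))
y = ((X[:, 0] > 0) ^ (X[:, 1] > 0)).astype(float)
best = evolve_depth(X, y)
print("best depth:", best, "training accuracy:", fitness(best, X, y))
```

The point of the sketch is the split of responsibilities: BP only ever adjusts continuous node weights, while the GA explores discrete structural choices that have no gradient, which mirrors the combination the abstract describes.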
