Abstract

In this paper, we present improvements in the classification accuracy and the parallel efficiency of ensemble self-generating neural networks (ESGNNs) on a MIMD parallel computer. Self-generating neural networks (SGNNs) were originally proposed for classification and clustering; they automatically construct a self-generating neural tree (SGNT) from the given training data. An ESGNN is composed of multiple SGNTs, each generated independently by shuffling the order of the training data, and its output is the average of the outputs of its SGNTs. We allocate each SGNT to a separate processor of the MIMD parallel computer. Experimental results show that, for all problems, the misclassification rate decreases as the number of processors increases.
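The ensemble scheme described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the real method grows a self-generating neural tree from the data, whereas here a stored-points 1-nearest-neighbour classifier stands in for a trained SGNT. The function names (`train_member`, `ensemble_predict`) and the toy data are assumptions for the sketch; only the ensemble structure (members trained on differently shuffled orderings, outputs averaged) follows the abstract.

```python
import random

def train_member(data, seed):
    # Placeholder for SGNT training: each ensemble member sees the same
    # training data in a different random order.
    rng = random.Random(seed)
    shuffled = list(data)
    rng.shuffle(shuffled)
    return shuffled  # stands in for a trained SGNT

def predict_member(member, x):
    # Stand-in for descending the SGNT: 1-NN over the stored points.
    nearest = min(member, key=lambda p: abs(p[0] - x))
    return nearest[1]

def ensemble_predict(members, x):
    # The ESGNN output is the average of the member outputs; with labels
    # in {0, 1} this behaves like a vote.
    votes = [predict_member(m, x) for m in members]
    return sum(votes) / len(votes)

# Toy 1-D data: (feature, label)
data = [(0.0, 0), (0.2, 0), (0.8, 1), (1.0, 1)]

# In the paper each SGNT is built on its own processor; here the members
# are trained sequentially, one shuffle seed per (notional) processor.
members = [train_member(data, seed) for seed in range(5)]
print(ensemble_predict(members, 0.1))
print(ensemble_predict(members, 0.9))
```

Because each member is trained independently, the per-processor work requires no communication until the outputs are averaged, which is what makes the MIMD allocation straightforward.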
