Abstract

GMDH, which stands for Group Method of Data Handling, is an evolutionary type of neural network. It has received much attention in the supercomputing research community because of its ability to optimize its internal structure for maximum prediction accuracy. GMDH works by evolving itself from a basic network, expanding its number of neurons and hidden layers until no further performance gain can be obtained. The authors previously proposed a novel strategy that extends existing GMDH neural network techniques. The new strategy, called residual-feedback, retains and reuses past prediction errors as additional columns of the multivariate sample data, providing relevant inputs to the GMDH neural network. This matters because the strength of GMDH, like any neural network, lies in predicting outcomes from multivariate data, and it is highly noise-tolerant. GMDH is a well-known ensemble type of prediction method capable of modeling highly non-linear relations, and maximum accuracy is often achieved with a minimal number of neurons and the simplest layered structure. This paper contributes the technical design of a GMDH implementation on the GPU, where all weight computations run in parallel GPU memory blocks. It is a first step towards developing a complex neural network architecture on the GPU that can evolve and expand its structure to the minimum sufficient for maximum prediction accuracy on the given input data.
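
Since the abstract walks through the GMDH growth procedure and the residual-feedback idea, the following sketch may help make them concrete. It is a minimal CPU-side illustration, not the paper's GPU design: the names (`fit_neuron`, `gmdh_train`, the `width` and `max_layers` parameters) are hypothetical, the quadratic two-input polynomial neuron is the standard GMDH convention rather than a detail confirmed by the paper, and the residual-feedback lines at the end only show the lag-1 error augmentation described above. In the paper's design, each candidate neuron's weight computation would run in its own parallel GPU memory block; here the fits run serially.

```python
import numpy as np
from itertools import combinations

def fit_neuron(x1, x2, y):
    """Least-squares fit of a quadratic GMDH polynomial neuron:
    y ~ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

def neuron_output(coeffs, x1, x2):
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    return A @ coeffs

def gmdh_train(X_train, y_train, X_val, y_val, max_layers=5, width=8):
    """Grow layers of pairwise polynomial neurons, keep the `width` best
    neurons per layer by validation error, and stop evolving as soon as a
    new layer fails to improve on the previous best (the abstract's
    'no further performance gain' criterion)."""
    best_err = np.inf
    model = []                      # each layer: list of (i, j, coeffs)
    Ztr, Zva = X_train, X_val
    for _ in range(max_layers):
        candidates = []
        for i, j in combinations(range(Ztr.shape[1]), 2):
            c = fit_neuron(Ztr[:, i], Ztr[:, j], y_train)
            err = np.mean((neuron_output(c, Zva[:, i], Zva[:, j]) - y_val) ** 2)
            candidates.append((err, i, j, c))
        candidates.sort(key=lambda t: t[0])
        layer = candidates[:width]
        if layer[0][0] >= best_err:  # no further gain: stop growing
            break
        best_err = layer[0][0]
        model.append([(i, j, c) for _, i, j, c in layer])
        Ztr = np.column_stack([neuron_output(c, Ztr[:, i], Ztr[:, j]) for _, i, j, c in layer])
        Zva = np.column_stack([neuron_output(c, Zva[:, i], Zva[:, j]) for _, i, j, c in layer])
    return model, best_err

def gmdh_predict(model, X):
    Z = X
    for layer in model:
        Z = np.column_stack([neuron_output(c, Z[:, i], Z[:, j]) for i, j, c in layer])
    return Z[:, 0]                  # best neuron of the final layer

# Demo on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 4))
y = X[:, 0] * X[:, 1] + 0.5 * X[:, 2] ** 2 + 0.1 * rng.normal(size=400)
model, err = gmdh_train(X[:300], y[:300], X[300:], y[300:])

# Residual-feedback (per the abstract): retain past prediction errors and
# reuse them as an extra input column; here a lag-1 residual feature,
# assuming the rows are time-ordered.
resid = y[:300] - gmdh_predict(model, X[:300])
X_fb = np.column_stack([X[1:300], resid[:-1]])
```

The stopping rule in `gmdh_train` is what makes the network "minimally sufficient": growth halts at the first layer that does not reduce validation error, so the retained structure is the smallest one the data supports.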
