Abstract

In the present work, a constructive learning algorithm is employed to design an optimal one-hidden-layer neural network structure that best approximates a given mapping. The method determines not only the optimal number of hidden neurons but also the best activation function for each node. Here, the projection pursuit technique is applied in association with the optimization of the solvability condition, giving rise to a more efficient and accurate computational learning algorithm. Because the activation function of each hidden neuron is optimally defined for every approximation problem, better rates of convergence are achieved. Since the training process handles the hidden neurons individually, a suitable activation function built from Hermite polynomials can be iteratively developed for each neuron as a function of the learning set. The proposed constructive learning algorithm was successfully applied to identify a large-scale multivariate process, providing a multivariable model able to describe the complex process dynamics, even in long-range horizon predictions.
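The abstract does not give the authors' exact procedure, but the general idea of constructively growing a one-hidden-layer network, where each new neuron projects the inputs onto a direction and applies an activation expanded in Hermite polynomials fitted to the current residual, can be sketched as follows. All function and parameter names here (`fit_hermite_unit`, `constructive_fit`, the random choice of projection directions, the polynomial degree) are illustrative assumptions, not the method of the paper.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def fit_hermite_unit(X, r, w, degree=5):
    """Fit one hidden unit (illustrative): the activation is an expansion in
    probabilists' Hermite polynomials He_0..He_degree of the projection X @ w,
    with coefficients chosen by least squares against the current residual r.
    This mimics a single projection-pursuit step."""
    z = X @ w                       # project inputs onto direction w
    H = hermevander(z, degree)      # Hermite-polynomial design matrix
    coef, *_ = np.linalg.lstsq(H, r, rcond=None)
    return coef, H @ coef           # coefficients and the unit's output

def constructive_fit(X, y, n_units=8, degree=5, seed=0):
    """Grow the hidden layer one neuron at a time, each neuron fitted to the
    residual left by the previous ones (constructive learning sketch).
    Directions are drawn at random here for simplicity; a real projection
    pursuit step would optimize them."""
    rng = np.random.default_rng(seed)
    r = y.astype(float).copy()
    units = []
    for _ in range(n_units):
        w = rng.standard_normal(X.shape[1])
        w /= np.linalg.norm(w)
        coef, pred = fit_hermite_unit(X, r, w, degree)
        units.append((w, coef))
        r = r - pred                # residual passed to the next unit
    return units, r

# toy usage: approximate a nonlinear mapping of three inputs
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2
units, residual = constructive_fit(X, y)
```

Each least-squares fit projects the residual onto the span of the Hermite basis, so the residual norm is non-increasing as neurons are added; tailoring the polynomial coefficients per neuron is what the abstract credits for the improved convergence rate.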
