Abstract

In the present work, a constructive learning algorithm is employed to design an optimal one-hidden-layer neural network structure that best approximates a given mapping. The method determines not only the optimal number of hidden neurons but also the best activation function for each node. The projection pursuit technique is applied in association with the optimization of the solvability condition, giving rise to a more efficient and accurate computational learning algorithm. Since the activation function of each hidden neuron is optimally defined for every approximation problem, better rates of convergence are achieved. The proposed constructive learning algorithm was successfully applied to identify a large-scale multivariate process, providing a multivariable model able to describe the complex process dynamics even over long-range prediction horizons. The resulting identification model was then embedded in a model-based predictive control strategy, achieving high-quality performance in closed-loop experiments.
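
The abstract describes a stagewise, projection-pursuit-style construction of a one-hidden-layer network in which neurons are added one at a time and the activation function of each new node is selected per problem. The sketch below is an illustration of that general idea only, assuming a small candidate set of activations and a simple random-search fit of each new unit to the current residual; the names, candidate activations, and stopping rule are hypothetical, and the authors' solvability-condition optimization is not reproduced here.

```python
# Minimal sketch of constructive, projection-pursuit-style growth of a
# one-hidden-layer network. Hidden units are added one at a time; each unit's
# activation is chosen from a candidate set by how well it fits the residual.
# All names and the random-search fit are illustrative assumptions, not the
# paper's exact procedure.
import numpy as np

ACTIVATIONS = {
    "tanh": np.tanh,
    "logistic": lambda z: 1.0 / (1.0 + np.exp(-z)),
    "sin": np.sin,
}

def fit_unit(X, r, act, n_trials=200, rng=None):
    """Fit one hidden unit act(X @ w + b) to the residual r by random search."""
    rng = np.random.default_rng() if rng is None else rng
    best = (np.inf, None, None, None)
    for _ in range(n_trials):
        w = rng.standard_normal(X.shape[1])
        b = rng.standard_normal()
        h = act(X @ w + b)
        # Output weight and bias for this unit by least squares on [h, 1].
        A = np.column_stack([h, np.ones_like(h)])
        coef, *_ = np.linalg.lstsq(A, r, rcond=None)
        err = np.mean((A @ coef - r) ** 2)
        if err < best[0]:
            best = (err, w, b, coef)
    return best

def constructive_fit(X, y, max_units=20, tol=1e-4):
    """Grow the hidden layer until the residual improvement falls below tol."""
    residual = y.astype(float).copy()
    units = []
    for _ in range(max_units):
        # Try every candidate activation and keep the one that best explains
        # the current residual.
        trials = {name: fit_unit(X, residual, act) for name, act in ACTIVATIONS.items()}
        name = min(trials, key=lambda k: trials[k][0])
        err, w, b, coef = trials[name]
        if np.mean(residual ** 2) - err < tol:
            break
        units.append((name, w, b, coef))
        residual = residual - (
            np.column_stack([ACTIVATIONS[name](X @ w + b), np.ones(len(X))]) @ coef
        )
    return units

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(400, 2))
    y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] ** 2
    model = constructive_fit(X, y)
    print(f"{len(model)} hidden units selected:", [u[0] for u in model])
```

In this toy version the network stops growing when adding a unit no longer reduces the mean-squared residual appreciably, which stands in for the principled stopping and node-selection criteria discussed in the paper.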
