Abstract

The artificial neural network (ANN) is becoming a very popular model for engineering and scientific applications. Inspired by brain architecture, artificial neural networks represent a class of nonlinear models capable of learning from data. Neural networks have been applied in many areas, including pattern matching, classification, prediction, and process control. This article focuses on the construction of prediction intervals. Previous statistical theory for constructing confidence intervals for the parameters (the weights in an ANN) is inappropriate, because the parameters are unidentifiable. We show in this article that this problem disappears in prediction. We then construct asymptotically valid prediction intervals and also show how to use them to choose the number of nodes in the network. We then apply the theory to an example of predicting electrical load.
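
As a rough illustration of the kind of interval the abstract describes, the sketch below forms an asymptotic prediction interval for a single-hidden-layer network, assuming the standard nonlinear-least-squares form ŷ ± t·s·√(1 + gᵀ(JᵀJ)⁻¹g), where J is the Jacobian of the fitted outputs with respect to the weights and g is the corresponding gradient at the new input. The network size, function names, and the use of a numerical Jacobian are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (not the authors' code) of an asymptotic prediction interval
# for a one-hidden-layer network, assuming the standard nonlinear-least-squares
# form  yhat +/- t * s * sqrt(1 + g' (J'J)^{-1} g).
import numpy as np
from scipy.optimize import least_squares
from scipy.stats import t as t_dist

def net(w, x, h=3):
    """Single-hidden-layer network with h logistic nodes; x is a 1-D input array."""
    a, b, c = w[:h], w[h:2 * h], w[2 * h:]              # hidden weights, hidden biases, output weights
    z = 1.0 / (1.0 + np.exp(-(np.outer(x, a) + b)))     # (n, h) hidden activations
    return z @ c

def fit(x, y, h=3, seed=0):
    """Least-squares fit of the network weights from a random start."""
    w0 = np.random.default_rng(seed).normal(scale=0.5, size=3 * h)
    return least_squares(lambda w: net(w, x, h) - y, w0).x

def prediction_interval(w_hat, x, y, x_new, h=3, alpha=0.05):
    n, p = len(x), len(w_hat)
    resid = y - net(w_hat, x, h)
    s2 = resid @ resid / (n - p)                         # residual variance estimate
    # Numerical Jacobian of fitted values w.r.t. the weights (n x p),
    # plus the gradient of the prediction at the new input.
    eps, J, g = 1e-6, np.empty((n, p)), np.empty(p)
    for j in range(p):
        dw = np.zeros(p); dw[j] = eps
        J[:, j] = (net(w_hat + dw, x, h) - net(w_hat - dw, x, h)) / (2 * eps)
        g[j] = (net(w_hat + dw, np.array([x_new]), h)[0]
                - net(w_hat - dw, np.array([x_new]), h)[0]) / (2 * eps)
    cov = np.linalg.pinv(J.T @ J)                        # pseudo-inverse guards against rank deficiency
    half = t_dist.ppf(1 - alpha / 2, n - p) * np.sqrt(s2 * (1 + g @ cov @ g))
    y_new = net(w_hat, np.array([x_new]), h)[0]
    return y_new - half, y_new + half

# Toy usage: noisy samples from a smooth curve.
rng = np.random.default_rng(1)
x = np.linspace(-2, 2, 80)
y = np.sin(x) + rng.normal(scale=0.1, size=x.size)
w_hat = fit(x, y)
print(prediction_interval(w_hat, x, y, x_new=0.5))
```

The pseudo-inverse in the interval computation is only a numerical guard: as the abstract notes, the weights themselves are unidentifiable, but the fitted prediction and its interval are still well defined.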
