Abstract

Summary form only given, as follows. The possible existence of local minima in the error surfaces of backpropagation neural networks has been an important open question. Experience has shown that regions of the error surface with small slope but high mean squared error are frequently encountered during training. Such regions are often mistaken for local minima because no significant decrease in error occurs over considerable training time; in many cases, if training is continued, the shallow region is eventually traversed. These experiences made it plausible to conjecture that backpropagation error surfaces have no local minima. The results of an exploration of the error surfaces of two networks are discussed, and the discovery of a true local minimum is documented.
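
The distinction the abstract draws is easy to instrument in practice: on a plateau the gradient is small but nonzero and the error, though high, eventually falls, whereas at a true local minimum the gradient itself vanishes while the error stays stuck. The following is a minimal sketch of that diagnostic, assuming a toy 2-2-1 sigmoid network trained on XOR with plain gradient descent; the architecture, data, learning rate, and print schedule are illustrative assumptions, not the networks examined in the paper.

```python
import numpy as np

# Illustrative only: a tiny 2-2-1 sigmoid network on XOR, trained with plain
# gradient descent on mean squared error. All choices here are assumptions
# for demonstration, not the paper's networks.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights and biases for the hidden and output layers.
W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.zeros(1)

lr = 0.5
for step in range(20001):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output
    err = out - y
    mse = float(np.mean(err ** 2))

    # Backward pass for mean-squared-error loss with sigmoid units.
    d_out = 2.0 * err / len(X) * out * (1.0 - out)
    gW2 = h.T @ d_out; gb2 = d_out.sum(axis=0)
    d_h = d_out @ W2.T * h * (1.0 - h)
    gW1 = X.T @ d_h; gb1 = d_h.sum(axis=0)

    # Gradient norm: the quantity that separates a shallow plateau
    # (small but nonzero gradient, high error that later drops) from a
    # true local minimum (gradient ~ 0 while the error stays stuck).
    gnorm = np.sqrt(sum(float(np.sum(g ** 2)) for g in (gW1, gb1, gW2, gb2)))
    if step % 2000 == 0:
        print(f"step {step:6d}  mse {mse:.4f}  |grad| {gnorm:.2e}")

    # Gradient-descent update.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
```

Watching both quantities together is the point: a run that lingers at high MSE with a small but nonzero gradient norm, then drops, is the plateau behavior the abstract describes, while a vanishing gradient norm at persistently high error is consistent with a genuine local minimum.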
