Abstract

The single-layer backpropagation algorithm is a gradient-descent method that adjusts the connection weights of a single-layer perceptron to minimize the mean-square error at the output. It is similar to the standard least mean square algorithm, except the output of the linear combiner contains a differentiable nonlinearity. In this paper, we present a statistical analysis of the mean weight behavior of the single-layer backpropagation algorithm for Gaussian input signals. It is based on a nonlinear system identification model of the desired response which is capable of generating an arbitrary hyperplane decision boundary. It is demonstrated that, although the weights grow unbounded, the algorithm, on average, quickly learns the correct hyperplane associated with the system identification model.
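The update rule described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the step size, dimensions, tanh nonlinearity, and the sign-based desired-response model are assumptions chosen for the sketch. Relative to standard LMS, the chain rule through the nonlinearity contributes an extra derivative factor (here, 1 - y^2 for tanh) in the weight update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions and step size (assumed, not from the paper).
N = 4          # input dimension
mu = 0.1       # step size
n_iters = 2000

# System-identification model of the desired response: a fixed
# hyperplane w_star whose sign generates the binary desired output.
w_star = rng.standard_normal(N)

w = np.zeros(N)  # adaptive perceptron weights

for _ in range(n_iters):
    x = rng.standard_normal(N)      # Gaussian input vector
    d = np.sign(x @ w_star)         # desired response from the model
    y = np.tanh(x @ w)              # linear combiner + differentiable nonlinearity
    e = d - y                       # instantaneous error
    # Gradient descent on the squared error; (1 - y**2) is the tanh
    # derivative that distinguishes this update from standard LMS.
    w += mu * e * (1.0 - y ** 2) * x

# The learned weight direction should align with the model hyperplane,
# even as the weight norm keeps growing.
cosine = (w @ w_star) / (np.linalg.norm(w) * np.linalg.norm(w_star))
```

Because the desired response is saturated at plus or minus one while tanh only approaches those values asymptotically, the weight norm grows without bound; the direction of `w`, which determines the decision boundary, is what converges.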
