Abstract

In this study we consider a multilayer perceptron network with sigmoidal activation, trained via the backpropagation algorithm. The outputs of all neurons are collected and a simple linear regression is performed on them. It is shown that untrained networks with randomly chosen coefficients perform comparably to fully trained networks. This result casts new light on the role of activation functions, the impact of dimensionality, and the efficacy of training algorithms such as backpropagation.
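The setup described above — a hidden layer whose weights are drawn at random and never trained, with a simple linear regression fitted on the collected neuron outputs — can be sketched as follows. This is a minimal illustration on an assumed toy task (fitting sin(x)); the task, network size, and weight scale are choices for the sketch, not the paper's experiment:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy regression task: y = sin(x) on [-3, 3]
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel()

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Untrained hidden layer: coefficients chosen at random, never updated
n_hidden = 100
W = rng.normal(0.0, 1.0, size=(1, n_hidden))
b = rng.normal(0.0, 1.0, size=n_hidden)

# Collect the outputs of all hidden neurons
H = sigmoid(X @ W + b)

# Simple linear regression (least squares) on the collected outputs,
# with an appended column of ones for the intercept
design = np.column_stack([H, np.ones(len(H))])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
y_hat = design @ coef

mse = np.mean((y - y_hat) ** 2)
print(f"train MSE with random hidden weights: {mse:.4f}")
```

With enough random hidden units, the linear readout alone fits the toy target closely, which is the flavor of the abstract's claim: the random nonlinear layer already provides a rich set of features, so backpropagation through it may add little.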
