Abstract

Although artificial neural networks are employed in an ever-growing variety of applications, their inner workings are still viewed as a black box, owing to the complexity of the non-linear dynamics that govern neural network learning. The key parameters in this learning process are the so-called interconnection strengths, or weights, of the connections between the neurons. In the absence of empirical data, mathematical approaches for studying the 'inside' of neural networks have to resort to assumptions such as a Normal distribution of the weight values. In order to better understand what goes on inside neural networks, a thorough study of the real probability distribution of the weight values is important. Beyond this, knowledge about weight distributions is also a main ingredient for weight reduction schemes that enable the creation of partially connected neural networks, and for network capacity calculations. This paper reports on the findings of an extensive empirical study of the distributions of weights in backpropagation neural networks, and tests formally whether the weights of a trained neural network indeed follow a Normal distribution.
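The abstract does not specify which formal test the paper applies, so the following is only an illustrative sketch of the general procedure it describes: train a small backpropagation network, pool its interconnection strengths (weights), and subject them to a Normality test. It assumes scikit-learn's MLPClassifier as a stand-in for the backpropagation networks studied in the paper and uses SciPy's Shapiro-Wilk test as one possible choice of test.

```python
# Illustrative sketch only -- not the paper's actual experimental setup.
import numpy as np
from scipy import stats
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# A toy dataset and a small fully connected backpropagation network.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
net.fit(X, y)

# Pool all interconnection strengths (weights) from every layer of the trained net.
weights = np.concatenate([w.ravel() for w in net.coefs_])

# Shapiro-Wilk test: the null hypothesis is that the weights are Normally distributed.
stat, p_value = stats.shapiro(weights)
print(f"n = {weights.size}, W = {stat:.4f}, p = {p_value:.4g}")
if p_value < 0.05:
    print("Reject Normality of the weights at the 5% level.")
else:
    print("No evidence against Normality of the weights at the 5% level.")
```

In practice such a study would repeat this over many network architectures, training runs, and per-layer weight groups rather than a single pooled sample, which is what an extensive empirical study of weight distributions would require.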
