Abstract

Sensitivity analysis of a neural network is mainly investigated after the network has been designed and trained; few studies have treated it as a critical issue prior to network design. Piche's statistical method (1992, 1995) is useful for multilayer perceptron (MLP) design, but it imposes overly severe limitations on both input and weight perturbations. This paper attempts to generalize Piche's method by deriving a universal expression of MLP sensitivity for antisymmetric squashing activation functions, without any restriction on input and output perturbations. Experimental results based on a three-layer MLP with 30 nodes per layer agree closely with our theoretical investigations. The effects of network design parameters such as the number of layers, the number of neurons per layer, and the chosen activation function are analyzed, providing useful information for network design decision-making. Based on this sensitivity analysis of the MLP, we present a network design method that, for a given application, determines the network structure and estimates the permitted weight range for network training.
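
The following Python sketch is illustrative only and is not the paper's analytical derivation: it estimates MLP output sensitivity empirically by Monte Carlo sampling of input and weight perturbations. The tanh activation (an antisymmetric squashing function), the layer sizes (three layers of 30 nodes, mirroring the experimental setup above), and the perturbation scales are assumptions chosen for the example.

```python
# Illustrative sketch: Monte Carlo estimate of MLP output sensitivity to
# input and weight perturbations. Layer sizes, perturbation scales, and the
# tanh activation are assumptions, not the paper's analytical method.
import numpy as np

rng = np.random.default_rng(0)


def mlp_forward(x, weights):
    """Forward pass through an MLP with tanh (antisymmetric squashing) units."""
    a = x
    for W in weights:
        a = np.tanh(W @ a)
    return a


def estimate_sensitivity(weights, n_inputs, input_sigma, weight_sigma, trials=2000):
    """Average output deviation under random input and weight perturbations."""
    deviations = []
    for _ in range(trials):
        x = rng.uniform(-1.0, 1.0, size=n_inputs)
        dx = rng.normal(0.0, input_sigma, size=n_inputs)
        perturbed = [W + rng.normal(0.0, weight_sigma, size=W.shape) for W in weights]
        y = mlp_forward(x, weights)
        y_perturbed = mlp_forward(x + dx, perturbed)
        deviations.append(np.linalg.norm(y_perturbed - y))
    return float(np.mean(deviations))


# Three-layer MLP with 30 nodes per layer, as in the experiments described above.
layer_sizes = [30, 30, 30, 30]  # input dimension followed by three layers
weights = [rng.normal(0.0, 0.5, size=(layer_sizes[i + 1], layer_sizes[i]))
           for i in range(len(layer_sizes) - 1)]

print(estimate_sensitivity(weights, n_inputs=30,
                           input_sigma=0.05, weight_sigma=0.05))
```

Varying the number of layers, the number of neurons per layer, or the activation function in such a simulation is one way to observe empirically the design-parameter effects discussed in the abstract.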
