Abstract
This paper shows a quantitative relation between regularization techniques, generalization ability, and the sensitivity of the Multilayer Perceptron (MLP) to input noise. Although many studies on these topics have been presented, in most cases only one of the problems is addressed, and only experimental evidence is provided to illustrate some correlation between generalization, noise immunity, and the use of regularization techniques to obtain, after training, a set of weights that provides the corresponding MLP with generalization ability and noise immunity. Here, a new measurement of noise immunity for an MLP is presented. This measurement, termed Mean Squared Sensitivity (MSS), explicitly evaluates the degradation of the Mean Squared Error (MSE) of an MLP when its inputs are perturbed by noise, and can be computed from the (previously proposed) statistical sensitivities of the output neurons. The MSS provides an accurate evaluation of the performance loss of an MLP whose inputs are perturbed by noise, and can also be considered a measurement of the smoothness of the error surface with respect to the inputs. Thus, since the MSS can be used to evaluate either noise immunity or generalization ability, it provides a criterion for selecting among different weight configurations that present a similar MSE after training.
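As a minimal numerical sketch of the kind of quantity the MSS captures, one can approximate the MSE degradation of a trained MLP empirically by perturbing its inputs with small zero-mean Gaussian noise and averaging the resulting increase in MSE over many noise draws. This is an illustrative Monte-Carlo proxy, not the paper's closed-form expression based on statistical sensitivities; the network, noise level, and helper names below are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    """Forward pass of a small one-hidden-layer MLP with tanh hidden units."""
    h = np.tanh(x @ W1 + b1)
    return h @ W2 + b2

def mse(y_pred, y_true):
    """Mean Squared Error over all examples and outputs."""
    return float(np.mean((y_pred - y_true) ** 2))

def estimate_mss(x, y_true, params, noise_std=0.05, n_trials=200):
    """Empirical proxy for the Mean Squared Sensitivity:
    average increase in MSE when inputs are perturbed by
    zero-mean Gaussian noise of standard deviation `noise_std`.
    """
    clean_mse = mse(mlp_forward(x, *params), y_true)
    degradations = []
    for _ in range(n_trials):
        x_noisy = x + rng.normal(0.0, noise_std, size=x.shape)
        degradations.append(mse(mlp_forward(x_noisy, *params), y_true) - clean_mse)
    return float(np.mean(degradations))

# Toy data and randomly initialized weights, purely for demonstration;
# in practice the weights would come from a trained (regularized) MLP.
x = rng.normal(size=(64, 8))
y = rng.normal(size=(64, 1))
params = (0.3 * rng.normal(size=(8, 16)), np.zeros(16),
          0.3 * rng.normal(size=(16, 1)), np.zeros(1))

print("estimated MSS:", estimate_mss(x, y, params))
```

Under this reading, comparing the estimate across weight configurations with similar clean MSE would single out the configuration whose error degrades least under input noise, which is the selection criterion the abstract describes.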