Abstract

This paper discusses the problem of variable selection for neural network modeling. Two methods that gave the best results in a previous comparative study are presented: one is a modified version of Hinton diagrams; the other is based on saliency estimation and forms part of the Optimal Brain Surgeon algorithm for pruning unimportant weights in a neural network. We also propose two new methods based on estimating the contribution of each input variable to the variance of the predicted response. These new methods are designed for situations where the input variables are orthogonal, such as the PC scores often used in multivariate calibration. The four methods are tested on synthetic examples and on real industrial data sets for multivariate calibration. The main characteristics of each method are discussed. In particular, we underline the strong theoretical and experimental limitations of methods such as modified Hinton diagrams that rely on weight-magnitude estimation. We also demonstrate that although the saliency estimation approach rests on firmer theoretical ground, it gives unstable results on repeated trials. The advantage of the two variance-based approaches is that they are much less dependent on the initial weight randomization than the other two methods, and therefore the results they produce are more stable and reliable.
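The abstract does not spell out how a variance-based contribution is computed, but the idea can be illustrated with a first-order sensitivity argument: when the inputs are orthogonal (uncorrelated), the variance of the predicted response decomposes approximately into one additive term per input, each proportional to the squared gradient of the prediction times that input's variance. The sketch below is only an illustration of this general principle, not the paper's exact formulation; all names (mlp_predict, mlp_gradients, W1, w2, the toy network and data) are hypothetical.

```python
# Hedged sketch, NOT the authors' exact method: a first-order estimate of each
# input's contribution to the variance of the predicted response, assuming the
# inputs are orthogonal (e.g. PC scores). All names here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Toy single-hidden-layer network: y = w2 . tanh(W1 x + b1) + b2
n_in, n_hid = 5, 8
W1 = rng.normal(size=(n_hid, n_in))
b1 = rng.normal(size=n_hid)
w2 = rng.normal(size=n_hid)
b2 = 0.1

def mlp_predict(X):
    """Predicted response for each row of X, shape (n_samples, n_in)."""
    return np.tanh(X @ W1.T + b1) @ w2 + b2

def mlp_gradients(X):
    """Gradient of the prediction w.r.t. each input, per sample."""
    H = np.tanh(X @ W1.T + b1)      # hidden activations
    dH = 1.0 - H ** 2               # derivative of tanh
    # dy/dx_j = sum_k w2_k * tanh'(a_k) * W1[k, j]
    return (dH * w2) @ W1           # shape (n_samples, n_in)

# Orthogonal inputs with unequal variances, mimicking PC scores
X = rng.normal(size=(2000, n_in)) * np.array([3.0, 2.0, 1.0, 0.5, 0.1])

# First-order variance contribution of input j:
#   Var_j ~= mean_over_samples[(dy/dx_j)^2] * Var(x_j)
# which sums approximately to Var(y) when inputs are uncorrelated.
grads = mlp_gradients(X)
var_contrib = (grads ** 2).mean(axis=0) * X.var(axis=0)
ranking = np.argsort(var_contrib)[::-1]

print("variance contribution per input:", np.round(var_contrib, 3))
print("input ranking (most to least important):", ranking)
print("sum of contributions vs Var(y):",
      round(var_contrib.sum(), 3), "vs", round(mlp_predict(X).var(), 3))
```

Because the contribution depends on the trained mapping and the input variances rather than on raw weight magnitudes, a ranking of this kind is less sensitive to the particular weight configuration reached from a given random initialization, which is the stability property the abstract attributes to the variance-based approaches.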
