Abstract

The problem of data encoding and feature selection for training backpropagation neural networks is well known. The basic principles are to avoid obscuring the underlying structure of the data and to avoid using irrelevant inputs. In practice, we often receive data that has already been processed by at least one previous user: it may contain too many instances of some classes, too few of others, and many irrelevant or redundant fields. Previous approaches have focussed on analysing the weight matrix of a trained network to determine the magnitude of each input's contribution to the output, and hence which inputs are less significant. This paper instead examines measures of the functional contribution of inputs to outputs. An input that contributes minor but unique information to the network is more significant than an input with a higher-magnitude contribution whose information is redundant, being also provided by another input. This paper presents a novel functional analysis of the weight matrix, based on a technique developed for determining the behavioural significance of hidden neurons, and compares it with the application of the same technique to the available training and test data. Finally, a novel aggregation technique is introduced.
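The distinction drawn above, between the magnitude of an input's contribution and the uniqueness of the information it carries, can be illustrated with a toy sketch. The example below is a hypothetical stand-in (a least-squares fit substitutes for a trained network, and the data, thresholds, and variable names are all assumptions for illustration): a redundant input can attract large weights, while a weakly weighted but statistically independent input still carries information no other input provides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: column 0 is informative, column 1 nearly duplicates it
# (redundant), column 2 is independent and weakly informative.
n = 500
x0 = rng.normal(size=n)
x2 = rng.normal(size=n)
X = np.column_stack([x0, x0 + 0.01 * rng.normal(size=n), x2])
y = x0 + 0.2 * x2 + 0.1 * rng.normal(size=n)

# Magnitude-based saliency: absolute weight per input.
# (A least-squares fit stands in for a trained network here,
# purely to illustrate the contrast -- not the paper's method.)
w, *_ = np.linalg.lstsq(X, y, rcond=None)
magnitude = np.abs(w)

# Redundancy: an input highly correlated with another input adds
# little unique information, whatever its weight magnitude.
corr = np.corrcoef(X, rowvar=False)
np.fill_diagonal(corr, 0.0)
redundancy = np.max(np.abs(corr), axis=0)

for i in range(X.shape[1]):
    print(f"input {i}: |w|={magnitude[i]:.2f}, "
          f"max corr with other inputs={redundancy[i]:.2f}")
```

Under this sketch, inputs 0 and 1 are flagged as mutually redundant (correlation near 1), so ranking inputs by weight magnitude alone would overstate the value of keeping both, while input 2's small weight understates its significance as the sole carrier of its information.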
