Abstract

The number of hidden neurons in a feed-forward neural network is usually chosen by experience. This often leaves the network with too few or too many hidden neurons, causing either insufficient capacity to store information or overfitting. This research proposes a new method for optimizing the number of hidden neurons based on information entropy. First, an initial network with ample hidden neurons is trained on a set of training samples. Second, the activation values of the hidden neurons are computed by feeding the trained network the training samples it classifies correctly. Third, candidate partitions of these activation values are evaluated by their information gain, and a decision tree that correctly divides the whole sample space is constructed. Finally, the important, class-relevant hidden neurons are identified by searching the tree, and the remaining redundant hidden neurons are deleted; the count of retained neurons gives the optimized number of hidden units. Applied to a neural network for tea quality evaluation, the method is shown to be effective in selecting the best number of hidden units.
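The tree-based pruning step can be sketched as follows. The snippet below is a minimal, illustrative pure-Python version (not the paper's implementation): given a matrix of hidden-neuron activations for correctly classified samples and their class labels, it grows an entropy-based decision tree with threshold splits and records which neurons the tree actually uses; all other neurons are treated as redundant.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(H, y):
    """Try midpoint thresholds on every hidden neuron; return the split
    (gain, neuron_index, threshold) with the highest information gain."""
    base, n = entropy(y), len(y)
    best = (0.0, None, None)
    for j in range(len(H[0])):
        vals = sorted(set(row[j] for row in H))
        for a, b in zip(vals, vals[1:]):
            t = (a + b) / 2
            left = [y[i] for i in range(n) if H[i][j] <= t]
            right = [y[i] for i in range(n) if H[i][j] > t]
            gain = (base - (len(left) / n) * entropy(left)
                         - (len(right) / n) * entropy(right))
            if gain > best[0]:
                best = (gain, j, t)
    return best

def build_tree(H, y, used):
    """Recursively partition the samples; `used` collects the indices of
    hidden neurons that appear in the tree (the neurons worth keeping)."""
    if len(set(y)) == 1:
        return y[0]                      # pure leaf
    gain, j, t = best_split(H, y)
    if j is None:                        # no informative split left
        return Counter(y).most_common(1)[0][0]
    used.add(j)
    li = [i for i in range(len(y)) if H[i][j] <= t]
    ri = [i for i in range(len(y)) if H[i][j] > t]
    return (j, t,
            build_tree([H[i] for i in li], [y[i] for i in li], used),
            build_tree([H[i] for i in ri], [y[i] for i in ri], used))

# Toy example: 4 samples, 3 hidden neurons; neuron 0 separates the classes,
# neurons 1 and 2 carry no class information and should be pruned.
H = [[0.1, 0.9, 0.5],
     [0.2, 0.1, 0.4],
     [0.8, 0.8, 0.5],
     [0.9, 0.2, 0.6]]
y = [0, 0, 1, 1]
used = set()
build_tree(H, y, used)
print(sorted(used))   # neurons to retain -> [0]
```

In this sketch the retained set is `{0}`, so the pruned network would keep one hidden neuron and delete the other two; on real activations the tree typically selects a small subset of the initial, deliberately oversized hidden layer.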


