Abstract

In negative correlation learning for designing neural network ensembles, the individual neural networks tend to become similar to one another over a long learning period. This paper adopts two approaches to prevent this. The first approach is to replace large neural networks with small neural networks in the ensemble; small networks are also more practical in real applications where capacity is limited. The second approach is to introduce random separation learning into negative correlation learning for each small network. The idea of random separation learning is to let each individual neural network learn differently on randomly separated subsets of the given training samples. It was found that the small neural networks readily become weak and different from each other under negative correlation learning with random separation learning. After a large number of small neural networks were applied to build the ensembles, two combination methods were used to generate the ensemble output, and their performance was compared.
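The combination described above can be sketched in code. The following is a minimal, illustrative NumPy implementation, not the paper's actual experiment: the toy regression task, the network size, the learning rate, and the penalty strength `LAM` are all assumptions. It uses the standard negative correlation learning gradient, in which each network's error gradient is `(f_i - y) - λ(f_i - f̄)` where `f̄` is the ensemble mean, and re-randomizes the subset assignment every epoch to realize random separation learning. Two simple combination rules (mean and median) stand in for the two combination methods, which the abstract does not name.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) + noise (illustrative, not from the paper).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

M = 8        # number of small networks in the ensemble (assumed)
H = 3        # hidden units per network -- deliberately "small"
LAM = 0.5    # negative-correlation penalty strength (assumed value)
LR = 0.05
EPOCHS = 500

# One tiny one-hidden-layer tanh network per ensemble member:
# weights W1 (1 x H), bias b1 (H,), weights W2 (H x 1), bias b2 (1,).
params = [
    [rng.normal(0, 0.5, (1, H)), np.zeros(H),
     rng.normal(0, 0.5, (H, 1)), np.zeros(1)]
    for _ in range(M)
]

def forward(p, X):
    W1, b1, W2, b2 = p
    h = np.tanh(X @ W1 + b1)           # hidden activations
    return h, (h @ W2 + b2)[:, 0]      # scalar output per sample

for epoch in range(EPOCHS):
    # Random separation learning: reshuffle the data each epoch and
    # give each network its own disjoint subset.
    idx = rng.permutation(len(X))
    subsets = np.array_split(idx, M)

    # Ensemble mean output on the full data, needed for the NCL penalty.
    outs = np.stack([forward(p, X)[1] for p in params])   # (M, N)
    f_bar = outs.mean(axis=0)

    for i, p in enumerate(params):
        s = subsets[i]
        W1, b1, W2, b2 = p
        h, f = forward(p, X[s])
        # NCL gradient w.r.t. the network output:
        # (f_i - y) - LAM * (f_i - f_bar).
        delta = (f - y[s]) - LAM * (f - f_bar[s])
        gW2 = h.T @ delta[:, None] / len(s)
        gb2 = delta.mean(keepdims=True)
        dh = delta[:, None] * W2[:, 0] * (1 - h ** 2)     # back through tanh
        gW1 = X[s].T @ dh / len(s)
        gb1 = dh.mean(axis=0)
        W1 -= LR * gW1
        b1 -= LR * gb1
        W2 -= LR * gW2
        b2 -= LR * gb2

# Two simple combination rules standing in for the paper's two methods.
outs = np.stack([forward(p, X)[1] for p in params])
pred_mean = outs.mean(axis=0)
pred_median = np.median(outs, axis=0)
```

Because each network sees only a different random fraction of the data per epoch, every individual member stays weak, while the negative correlation penalty pushes their outputs apart; the ensemble combination then recovers accuracy from their diversity.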
