Abstract
Balanced ensemble learning was developed from negative correlation learning by shifting the learning targets. Based on the different learning behaviors of balanced ensemble learning for two structures of neural network ensembles, on both low-noise and high-noise data, this paper reveals a number of new findings. The first is that ensembles of small neural networks trained by balanced ensemble learning can perform as well as ensembles of large neural networks trained by negative correlation learning. The second is that balanced ensemble learning seldom overfits when the ensembles consist of small neural networks; in contrast, overfitting was observed in balanced ensemble learning for ensembles of large neural networks on both low-noise and high-noise data. The third is that both large and small mean squared errors can lead to overfitting. That a larger mean squared error leads to overfitting, rather than underfitting, may come as a surprise; this paper presents explanations for this rare phenomenon.
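To make the starting point concrete, the sketch below shows the per-network gradient of the negative correlation learning (NCL) loss from which balanced ensemble learning is derived. This is an illustrative reconstruction, not the paper's implementation: the function name, the penalty coefficient `lam`, and the comment about target shifting are assumptions based on the standard NCL formulation, in which network i minimizes E_i = ½(F_i − y)² + λ(F_i − F̄)Σ_{j≠i}(F_j − F̄), giving the gradient (F_i − y) − λ(F_i − F̄). Balanced ensemble learning, as the abstract states, replaces the target y with a shifted target; the exact shifting rule is defined in the paper body.

```python
import numpy as np

def ncl_gradients(outputs, target, lam=0.5):
    """Per-network NCL gradient with respect to each network's output F_i.

    For the commonly cited NCL error
        E_i = 0.5 * (F_i - y)**2 + lam * (F_i - Fbar) * sum_{j != i} (F_j - Fbar),
    the derivative with respect to F_i simplifies to
        dE_i/dF_i = (F_i - y) - lam * (F_i - Fbar),
    since sum_j (F_j - Fbar) = 0.

    In balanced ensemble learning, `target` would be replaced by a shifted
    target (the shifting rule is defined in the paper, not here).
    """
    outputs = np.asarray(outputs, dtype=float)
    fbar = outputs.mean()  # ensemble (simple-average) output
    # Error term pulls each network toward the target; the negative
    # correlation term pushes it away from the ensemble mean.
    return (outputs - target) - lam * (outputs - fbar)
```

With `lam=0` the penalty vanishes and each network trains independently on its own squared error; larger `lam` increasingly decorrelates the networks' errors.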