Abstract

Ensembles of artificial neural networks have been used in recent years as classification/regression machines, showing generalization capabilities that outperform those of single networks. However, it has been recognized that for aggregation to be effective the individual networks must be as accurate and as diverse as possible. An important problem, then, is how to tune the aggregate members in order to strike an optimal compromise between these two conflicting conditions. We propose here a simple method for constructing regression/classification ensembles of neural networks that leads to overtrained aggregate members with an adequate balance between accuracy and diversity. The algorithm is tested favorably against other methods recently proposed in the literature, producing an improvement in performance on the standard statistical databases used as benchmarks. In addition, as a concrete application, we apply our method to the sunspot time series and predict the remainder of the current cycle 23 of solar activity.
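The core idea summarized above, aggregating diverse individual predictors and averaging their outputs, can be illustrated with a minimal sketch. This is not the authors' algorithm: the toy data, the bootstrap-resampling source of diversity, and the high-degree polynomial "overtrained" members are all illustrative assumptions. It does demonstrate the general ensemble property the abstract relies on: for squared error, the averaged prediction is never worse than the average member (the ambiguity decomposition).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: a noisy sine curve (illustrative data, not from the paper).
x = np.linspace(0, 2 * np.pi, 60)
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

def train_member(x, y, rng, degree=9):
    """Fit one deliberately flexible ('overtrained') member: a high-degree
    polynomial on a bootstrap resample. Resampling injects diversity."""
    idx = rng.integers(0, x.size, x.size)
    return np.polyfit(x[idx], y[idx], degree)

members = [train_member(x, y, rng) for _ in range(20)]

# Aggregate by simple averaging of the members' predictions.
x_test = np.linspace(0, 2 * np.pi, 200)
preds = np.stack([np.polyval(c, x_test) for c in members])
ensemble_pred = preds.mean(axis=0)

# Compare errors against the noise-free target.
target = np.sin(x_test)
avg_member_err = np.mean((preds - target) ** 2)
ensemble_err = np.mean((ensemble_pred - target) ** 2)
print(f"avg member MSE: {avg_member_err:.4f}, ensemble MSE: {ensemble_err:.4f}")
```

The guarantee `ensemble_err <= avg_member_err` holds with equality only when all members agree, which is why the paper's accuracy/diversity trade-off matters: more diverse members widen the gap, but only if each remains individually accurate.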
