Abstract
Combining several suitable neural networks can enhance the generalization performance of the group compared to a single network alone. However, it remains a largely open question how best to build such a combination of individuals. Jacobs and his colleagues proposed the mixture of experts (ME) model, in which a set of neural networks is trained together with a gate network. This tight coupling mechanism enables the system to (i) encourage diversity among the individual neural networks by specializing them in different regions of the input space and (ii) allow "good" combination weights for the ensemble members to emerge by training the gate, which computes the dynamic weights, jointly with the classifiers. In this paper, we wrap a cooperative coevolutionary (CC) algorithm around the basic ME model. This CC layer allows better exploration of the weight space and hence yields an ensemble with better performance. The results show that CCME outperforms the original ME on average on a number of classification problems. We also introduce a novel mechanism for visualizing the modular structures that emerge from the model.
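To make the gating mechanism described above concrete, the following is a minimal sketch of the basic ME combination step, assuming linear experts and a softmax gate; the parameter shapes, function names, and random initialization are illustrative assumptions, not taken from the paper, and no training loop (gradient-based, EM, or the proposed CC wrapper) is shown.

```python
import numpy as np

rng = np.random.default_rng(0)

n_experts, n_features, n_classes = 3, 4, 2

# Hypothetical expert and gate parameters; in the ME model these are
# trained jointly rather than sampled at random.
expert_weights = rng.normal(size=(n_experts, n_features, n_classes))
gate_weights = rng.normal(size=(n_features, n_experts))

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def me_predict(x):
    """Combine expert outputs using input-dependent gate weights."""
    # Each expert produces class probabilities for the input x.
    expert_out = softmax(np.einsum('f,efc->ec', x, expert_weights), axis=-1)
    # The gate assigns a dynamic weight to each expert for this input,
    # which is what lets experts specialize in regions of the input space.
    g = softmax(x @ gate_weights)        # shape: (n_experts,)
    # The ensemble output is the gate-weighted mixture of expert outputs.
    return g @ expert_out                # shape: (n_classes,)

x = rng.normal(size=n_features)
print(me_predict(x))  # mixture class probabilities for one input
```

Because the gate weights depend on the input, the effective combination weights differ from example to example, which is the property the CC layer described in the paper is intended to exploit when searching the weight space.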