Abstract

Mixture of experts is a neural-network-based ensemble learning approach consisting of several experts and a gating network. In this paper, we introduce the regularized root-quartic mixture of experts (R-RTQRT-ME), which incorporates a regularization term into the error function to control model complexity and to increase robustness against over-fitting and noise. Averaged over 20 classification benchmark datasets, R-RTQRT-ME performs 1.75%, 2.50%, and 2.29% better than multi-objective regularized negative correlation learning, multi-objective negative correlation learning, and multi-objective neural networks, respectively. Its average improvement is 1.16%, 2.31%, 3.40%, and 3.39% compared with the root-quartic mixture of experts, the mixture of negatively correlated experts, the mixture of experts, and negative correlation learning, respectively. Furthermore, the effect of the regularization penalty term in R-RTQRT-ME on noisy data is analyzed, demonstrating the robustness of R-RTQRT-ME in such situations.
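
To make the setting concrete, the sketch below illustrates a mixture of experts with a softmax gating network whose error function is augmented by a regularization penalty. It is a minimal sketch only: the linear experts, the squared error, the L2 penalty, and the regularization strength lam are assumptions for exposition and do not reproduce the paper's root-quartic error function.

```python
import numpy as np

# Minimal sketch of a mixture of experts with a regularized error function.
# Assumptions (not from the paper): linear experts, softmax gating,
# squared error, and an L2 penalty with strength lam.
rng = np.random.default_rng(0)
n_samples, n_features, n_experts = 100, 5, 3

X = rng.normal(size=(n_samples, n_features))
y = rng.normal(size=(n_samples, 1))

W_experts = rng.normal(size=(n_experts, n_features, 1))  # one linear expert each
W_gate = rng.normal(size=(n_features, n_experts))        # gating network weights
lam = 0.01                                               # assumed regularization strength

def predict(X):
    # Each expert produces its own output; the gating network mixes them per sample.
    expert_out = np.stack([X @ W for W in W_experts], axis=-1)   # (n, 1, k)
    gate_logits = X @ W_gate                                     # (n, k)
    gate = np.exp(gate_logits - gate_logits.max(axis=1, keepdims=True))
    gate /= gate.sum(axis=1, keepdims=True)                      # softmax gating
    return np.sum(expert_out * gate[:, None, :], axis=-1)        # (n, 1)

def regularized_error(X, y):
    # Ensemble error plus a penalty on all weights; R-RTQRT-ME analogously
    # augments its root-quartic error function with a regularization term.
    err = np.mean((predict(X) - y) ** 2)
    penalty = lam * (sum(np.sum(W ** 2) for W in W_experts) + np.sum(W_gate ** 2))
    return err + penalty

print(regularized_error(X, y))
```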
