Abstract
In this paper we propose a new way to combine classifiers that have been trained and diversified by negative correlation learning (NCL), which leads to superior results compared with the averaging and decision templates (DTs) methods used alone. In the proposed method, after the classifiers are trained with NCL, DTs and averaging are employed independently to combine them, and the outputs of these two combiners are then combined again by averaging. For this second-level combination, the support values are scaled to the [0, 1] interval. We show that each method used in the first level of combination performs better on a different part of the dataset, so that the two methods complement each other. We conducted two sets of experiments, one on the Satimage dataset and the other on the ORL dataset, both of which showed better performance for our method.
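The two-level combination described above can be sketched as follows. This is an illustrative sketch under stated assumptions, not the paper's implementation: each base classifier (e.g. one trained with NCL) is assumed to output a class-support vector, "DTs" is read as decision templates (class templates computed as mean decision profiles, matched by distance), and the min-max scaling, the toy data, and all function names are hypothetical.

```python
import numpy as np

def average_combiner(profiles):
    """First-level combiner 1: average the classifiers' outputs.
    profiles: array of shape (n_classifiers, n_classes)."""
    return profiles.mean(axis=0)

def fit_decision_templates(train_profiles, train_labels, n_classes):
    """First-level combiner 2 (fit): the template for class c is the mean
    decision profile over the training samples of class c.
    train_profiles: (n_samples, n_classifiers, n_classes)."""
    return np.stack([
        train_profiles[train_labels == c].mean(axis=0)
        for c in range(n_classes)
    ])  # shape (n_classes, n_classifiers, n_classes)

def decision_template_scores(profiles, templates):
    """Similarity (negative Euclidean distance) of a sample's decision
    profile to each class template."""
    return -np.linalg.norm(templates - profiles[None], axis=(1, 2))

def minmax_scale(v):
    """Scale a support vector to [0, 1] so the two combiners' outputs
    are comparable at the second level."""
    lo, hi = v.min(), v.max()
    return (v - lo) / (hi - lo) if hi > lo else np.zeros_like(v)

def two_level_combine(profiles, templates):
    """Second level: average the scaled outputs of both combiners."""
    s_avg = minmax_scale(average_combiner(profiles))
    s_dt = minmax_scale(decision_template_scores(profiles, templates))
    return (s_avg + s_dt) / 2.0

# Toy demo: 3 classifiers, 2 classes, 20 synthetic training samples.
rng = np.random.default_rng(0)
train_profiles = rng.dirichlet([1, 1], size=(20, 3))  # (20, 3, 2)
train_labels = np.array([0, 1] * 10)                  # both classes present
templates = fit_decision_templates(train_profiles, train_labels, 2)
sample_profile = rng.dirichlet([1, 1], size=3)        # (3, 2)
scores = two_level_combine(sample_profile, templates)
predicted_class = scores.argmax()
```

The final decision is the class with the highest averaged score; because the averaging and DT combiners err on different parts of the data, averaging their scaled supports lets each compensate for the other.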