Abstract

Ensemble pruning searches for a selective subset of members that performs as well as, or better than, the ensemble of all members. However, the accuracy/diversity pruning framework does not consider the generalization ability of the target ensemble, and there is no clear relationship between the two. In this paper, we prove that an ensemble formed from members with better generalization ability itself has better generalization ability. We adopt learning from both labeled and unlabeled data to improve the generalization ability of the member learners: a data-dependent kernel determined by a set of unlabeled points is plugged into the individual kernel learners, and ensemble pruning is then carried out as in previous work. The proposed method is suitable for both the single-instance and the multi-instance learning frameworks. Experimental results on 10 UCI data sets for single-instance learning and 4 data sets for multi-instance learning show that the subensemble formed by the proposed method is effective.
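As a rough illustration of the pruning step described above, the Python sketch below trains bagged RBF-kernel SVMs as ensemble members and greedily selects a subensemble by validation accuracy. It is a minimal sketch under simplifying assumptions: the data-dependent kernel built from unlabeled points is not reproduced here (a standard RBF kernel stands in for it), and the names used (prune_ensemble, target_size, etc.) are illustrative rather than taken from the paper.

# Minimal sketch of accuracy-based ensemble pruning with kernel learners.
# NOTE: the paper's data-dependent kernel from unlabeled points is omitted;
# a plain RBF kernel is used as a stand-in. All names are illustrative.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def prune_ensemble(members, X_val, y_val, target_size):
    """Greedily keep the members whose addition best improves validation accuracy."""
    selected, remaining = [], list(range(len(members)))
    preds = np.array([m.predict(X_val) for m in members])  # shape: (n_members, n_val)
    while len(selected) < target_size and remaining:
        best_i, best_acc = None, -1.0
        for i in remaining:
            votes = preds[selected + [i]]
            # majority vote of the candidate subensemble on the validation set
            agg = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)
            acc = np.mean(agg == y_val)
            if acc > best_acc:
                best_i, best_acc = i, acc
        selected.append(best_i)
        remaining.remove(best_i)
    return selected

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Bagged kernel learners (RBF SVMs) as ensemble members.
rng = np.random.default_rng(0)
members = []
for _ in range(15):
    idx = rng.choice(len(X_train), size=len(X_train), replace=True)
    members.append(SVC(kernel="rbf", gamma="scale").fit(X_train[idx], y_train[idx]))

kept = prune_ensemble(members, X_val, y_val, target_size=5)
print("selected members:", kept)

In the paper's setting, each member's RBF kernel would be replaced by the data-dependent kernel computed from the unlabeled points before pruning is applied.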
