Abstract

Ensemble learning, the machine learning paradigm in which multiple models are combined, has exhibited promising performance in a variety of tasks. The present work focuses on unsupervised ensemble classification. The term unsupervised refers to the ensemble combiner, which has no knowledge of the ground-truth labels on which the individual classifiers were trained. While most prior works on unsupervised ensemble classification are designed for independent and identically distributed (i.i.d.) data, the present work introduces an unsupervised scheme for learning from ensembles of classifiers in the presence of data dependencies. Two types of data dependencies are considered: sequential data, and networked data whose dependencies are captured by a graph. For both, novel moment-matching and Expectation-Maximization algorithms are developed. The performance of these algorithms is evaluated on synthetic and real datasets, and the results indicate that accounting for data dependencies at the meta-learner is beneficial for the unsupervised ensemble classification task.
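For context on the i.i.d. setting that the dependency-aware algorithms extend, the sketch below illustrates a standard EM-style unsupervised combiner (in the spirit of Dawid-Skene): it alternates between estimating per-classifier confusion matrices and soft label posteriors, using only the classifiers' predictions. This is a generic baseline for illustration, not the paper's proposed moment-matching or dependency-aware algorithms; the function name and NumPy implementation are my own assumptions.

```python
import numpy as np

def em_unsupervised_ensemble(preds, num_classes, num_iters=50, tol=1e-6):
    """EM aggregation of classifier outputs without ground-truth labels.

    preds: (num_classifiers, num_samples) array of integer predicted labels.
    Returns soft label posteriors q of shape (num_samples, num_classes)
    and estimated per-classifier confusion matrices.
    """
    M, N = preds.shape
    K = num_classes

    # Initialize posteriors with (soft) majority voting.
    q = np.zeros((N, K))
    for k in range(K):
        q[:, k] = (preds == k).sum(axis=0)
    q /= q.sum(axis=1, keepdims=True)

    for _ in range(num_iters):
        # M-step: class priors and confusion matrices from current soft labels.
        priors = q.mean(axis=0) + 1e-8
        priors /= priors.sum()
        conf = np.zeros((M, K, K))            # conf[m, true_class, predicted_class]
        for m in range(M):
            for k in range(K):
                conf[m, :, k] = q[preds[m] == k].sum(axis=0)
        conf += 1e-8                          # smoothing to avoid log(0)
        conf /= conf.sum(axis=2, keepdims=True)

        # E-step: update label posteriors given current parameter estimates.
        log_q = np.tile(np.log(priors), (N, 1))
        for m in range(M):
            log_q += np.log(conf[m, :, preds[m]].T)
        log_q -= log_q.max(axis=1, keepdims=True)
        q_new = np.exp(log_q)
        q_new /= q_new.sum(axis=1, keepdims=True)

        if np.abs(q_new - q).max() < tol:
            q = q_new
            break
        q = q_new

    return q, conf
```

The dependency-aware schemes described in the abstract replace the independence assumption across samples with sequential (e.g., HMM-like) or graph-based models of the latent labels; the E-step then couples neighboring samples rather than treating each one separately.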
