Abstract

As an alternative to traditional classification methods, semi-supervised learning algorithms have become an active research topic, exploiting the knowledge hidden in unlabeled data to build powerful and effective classifiers. In this work, a new ensemble-based semi-supervised algorithm is proposed, built around a maximum-probability voting scheme. The reported numerical results illustrate the efficacy of the proposed algorithm, which outperforms classical semi-supervised algorithms in terms of classification accuracy, leading to more efficient and robust predictive models.
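The abstract describes the ensemble as combining base classifiers through a maximum-probability vote. Below is a minimal sketch of that voting idea, assuming scikit-learn base learners that expose `predict_proba`; the function name `max_prob_vote` and the choice of base classifiers are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): for each sample the ensemble
# adopts the class label of whichever base learner is most confident.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier


def max_prob_vote(estimators, X):
    """Return the label chosen by the single most confident base learner per sample."""
    # Stack per-estimator class probabilities: shape (n_estimators, n_samples, n_classes)
    probas = np.stack([est.predict_proba(X) for est in estimators])
    n_samples = probas.shape[1]
    # For every sample, find the (estimator, class) pair with the highest probability.
    flat = probas.transpose(1, 0, 2).reshape(n_samples, -1)
    best = flat.argmax(axis=1)
    classes = estimators[0].classes_
    return classes[best % len(classes)], flat.max(axis=1)


if __name__ == "__main__":
    X, y = make_classification(n_samples=300, random_state=0)
    learners = [GaussianNB(), KNeighborsClassifier(), DecisionTreeClassifier(random_state=0)]
    for clf in learners:
        clf.fit(X, y)
    labels, confidence = max_prob_vote(learners, X)
    print(labels[:5], confidence[:5])
```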

Highlights

  • The development of a powerful and accurate classifier is considered one of the most significant and challenging tasks in machine learning and data mining [3]

  • Self-labeled techniques constitute a significant family of classification methods which progressively classify unlabeled data based on the most confident predictions and use them to refine the hypothesis learned from the labeled samples (a generic sketch of this loop appears after this list)

  • We focus our attention on Self-training, Co-training and Tri-training, which constitute the most efficient and commonly used self-labeled methods [21, 20, 22, 35, 37, 36]
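As a concrete illustration of the self-labeled idea referenced above, the following is a generic self-training loop, not the paper's exact procedure: the model repeatedly pseudo-labels the unlabeled points it is most confident about and retrains on the enlarged labeled set. The base learner, the confidence threshold and the function name are assumptions made for the sketch.

```python
# Generic self-training sketch: iteratively move confidently predicted
# unlabeled samples into the labeled pool and retrain.
import numpy as np
from sklearn.linear_model import LogisticRegression


def self_training(X_lab, y_lab, X_unlab, threshold=0.95, max_iter=10):
    model = LogisticRegression(max_iter=1000)
    X_l, y_l, X_u = X_lab.copy(), y_lab.copy(), X_unlab.copy()
    for _ in range(max_iter):
        model.fit(X_l, y_l)
        if len(X_u) == 0:
            break
        proba = model.predict_proba(X_u)
        confident = proba.max(axis=1) >= threshold  # keep only the most confident predictions
        if not confident.any():
            break
        pseudo = model.classes_[proba[confident].argmax(axis=1)]
        # Move the confidently pseudo-labeled samples into the labeled pool.
        X_l = np.vstack([X_l, X_u[confident]])
        y_l = np.concatenate([y_l, pseudo])
        X_u = X_u[~confident]
    return model
```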


Summary

A New Ensemble Semi-supervised Self-labeled Algorithm

Povzetek: A new semi-supervised learning algorithm is developed using ensembles and a voting scheme based on maximum probability.

Introduction
Related work
A review on semi-supervised self-labeled classification
Co-training
Tri-Training
Experimental results (JRip, kNN)
Conclusions & future research
