Abstract

Pseudo-labeling is a simple and well-known strategy in semi-supervised learning with neural networks. The method is equivalent to entropy minimization, as the overlap between class probability distributions can be reduced by minimizing the entropy of predictions on unlabeled data. In this paper we review the relationship between the two methods and evaluate their performance on Fine-Grained Visual Classification datasets. We also include the recently released iNaturalist-Aves dataset, which is specifically designed for semi-supervised learning. Experimental results show that, although supervised learning may still outperform the semi-supervised methods in some cases, semi-supervised learning is effective overall. Specifically, we observed that entropy minimization slightly outperforms a recently proposed method based on pseudo-labeling.
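The relationship between the two objectives can be sketched as follows. This is an illustrative NumPy toy, not code from the paper: entropy minimization penalizes the full predictive entropy on unlabeled inputs, while pseudo-labeling takes the cross-entropy against the model's own argmax predictions, which amounts to a "hard" version of the same confidence-promoting pressure.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def entropy_min_loss(logits):
    # Entropy minimization: mean H(p) = -sum_c p_c log p_c over the
    # unlabeled batch; small when predictions are confident.
    p = softmax(logits)
    return -np.mean(np.sum(p * np.log(p + 1e-12), axis=1))

def pseudo_label_loss(logits):
    # Pseudo-labeling: cross-entropy against the model's own argmax
    # ("hard") labels, i.e. -log of the top predicted probability.
    p = softmax(logits)
    y_hat = p.argmax(axis=1)
    return -np.mean(np.log(p[np.arange(len(y_hat)), y_hat] + 1e-12))
```

Since -log(max_c p_c) lower-bounds the entropy H(p), the pseudo-label loss never exceeds the entropy loss for the same predictions; both vanish as the model becomes confident, which is one way to see why the two methods behave similarly in practice.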
