Abstract

Semisupervised learning (SSL) has received much recent attention because it alleviates the need for large amounts of labeled data, which can be expensive to obtain, require expert knowledge, and be time-consuming to collect. Recent developments in deep semisupervised classification have reached unprecedented performance, and the gap between supervised and semisupervised learning is ever-decreasing. This improvement in performance has been based on the inclusion of numerous technical tricks, strong augmentation techniques, and costly optimization schemes with multiterm loss functions. We propose a new framework, LaplaceNet, for deep semisupervised classification that has greatly reduced model complexity. We utilize a hybrid approach in which pseudolabels are produced by minimizing the Laplacian energy on a graph; these pseudolabels are then used to iteratively train a neural-network backbone. Our model outperforms state-of-the-art methods for deep semisupervised classification on several benchmark datasets. Furthermore, we consider the application of strong augmentations to neural networks theoretically and justify the use of a multisampling approach for SSL. We demonstrate, through rigorous experimentation, that a multisampling augmentation approach improves generalization and reduces the sensitivity of the network to augmentation.
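To make the core step concrete, the sketch below illustrates one standard way of producing pseudolabels by minimizing a Laplacian energy over a graph of labeled and unlabeled examples. It is a minimal illustration, not the paper's implementation: it assumes a cosine-similarity kNN graph, the symmetrically normalized affinity S = D^{-1/2} W D^{-1/2}, and the classical closed-form minimizer F = (I - alpha*S)^{-1} Y of the associated smoothness-plus-fit energy; the function name `laplacian_pseudolabels` and the values k = 10 and alpha = 0.99 are hypothetical choices.

```python
import numpy as np
from scipy.sparse import csr_matrix, diags, identity
from scipy.sparse.linalg import cg


def laplacian_pseudolabels(features, labels, n_classes, k=10, alpha=0.99):
    """Propagate labels over a kNN graph by minimizing a Laplacian
    (smoothness) energy plus a fit-to-labels term.

    labels: integer class per point, with -1 marking unlabeled points.
    Returns hard pseudolabels for every point.
    """
    n = features.shape[0]
    # Cosine-similarity kNN affinity (dense for clarity; real code
    # would use an approximate nearest-neighbor index at scale).
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sims = feats @ feats.T
    np.fill_diagonal(sims, 0.0)
    idx = np.argsort(-sims, axis=1)[:, :k]
    rows = np.repeat(np.arange(n), k)
    W = csr_matrix(
        (sims[np.arange(n)[:, None], idx].ravel(), (rows, idx.ravel())),
        shape=(n, n),
    )
    W = W.maximum(W.T)  # symmetrize the graph

    # Symmetrically normalized affinity S = D^{-1/2} W D^{-1/2}.
    d = np.asarray(W.sum(axis=1)).ravel()
    d_inv_sqrt = diags(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = d_inv_sqrt @ W @ d_inv_sqrt

    # One-hot targets; rows for unlabeled points stay all-zero.
    Y = np.zeros((n, n_classes))
    for i, y in enumerate(labels):
        if y >= 0:
            Y[i, y] = 1.0

    # Energy minimizer F = (I - alpha*S)^{-1} Y, solved per class
    # with conjugate gradients instead of an explicit inverse.
    A = identity(n) - alpha * S
    F = np.zeros_like(Y)
    for c in range(n_classes):
        F[:, c], _ = cg(A, Y[:, c], maxiter=50)
    return F.argmax(axis=1)  # hard pseudolabels
```

In a hybrid scheme of this kind, the resulting pseudolabels would then supervise a cross-entropy loss on the unlabeled data, alternating between graph propagation on the network's embeddings and further training of the backbone.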
