Abstract

SSL (semi-supervised learning) is widely used in machine learning; it leverages both labeled and unlabeled data to improve model performance. SSL aims to optimize class mutual information, but noisy pseudo-labels introduce false class information because labels are scarce. Consequently, these algorithms often require substantial training time to iteratively refine pseudo-labels and improve performance. To tackle this challenge, we propose a novel plug-and-play method named Accelerating Semi-supervised Learning via Contrastive Learning (ASCL). This method combines contrastive learning with uncertainty-based selection to improve performance and accelerate the convergence of SSL algorithms. Contrastive learning initially emphasizes the mutual information between samples to decrease dependence on pseudo-labels, and then gradually shifts to maximizing the mutual information between classes, which aligns with the objective of semi-supervised learning. Uncertainty-based selection provides a robust mechanism for acquiring pseudo-labels. Together, the contrastive learning module and the uncertainty-based selection module form a virtuous cycle that improves the performance of the proposed model. Extensive experiments demonstrate that ASCL outperforms state-of-the-art methods in both convergence efficiency and accuracy. In the scenario where only one label is assigned per class in the CIFAR-10 dataset, applying ASCL to Pseudo-label, UDA (unsupervised data augmentation for consistency training), and FixMatch yields substantial improvements in classification accuracy of 16.32%, 6.9%, and 24.43%, respectively, over the original results. Moreover, the required training time is reduced by almost 50%.
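To make the abstract's two ingredients concrete, the following is a minimal sketch (not the authors' code) of how a confidence-based pseudo-label filter and an InfoNCE-style contrastive term might be combined in a FixMatch-style weak/strong augmentation setup. The names `encoder`, `classifier`, `threshold`, and `contrastive_weight` are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F


def info_nce_loss(z1, z2, temperature=0.5):
    """InfoNCE loss between two augmented views of the same batch."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature               # (B, B) similarity matrix
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)


def semi_supervised_step(encoder, classifier, x_weak, x_strong,
                         threshold=0.95, contrastive_weight=1.0):
    """One unlabeled-batch step: pseudo-label loss + contrastive loss (sketch)."""
    # Pseudo-labels come from the weakly augmented view (no gradients),
    # and only high-confidence predictions are kept (uncertainty-based selection).
    with torch.no_grad():
        probs = F.softmax(classifier(encoder(x_weak)), dim=1)
        confidence, pseudo_labels = probs.max(dim=1)
        mask = confidence >= threshold

    # Cross-entropy on the strongly augmented view, restricted to selected samples.
    feats_strong = encoder(x_strong)
    logits_strong = classifier(feats_strong)
    ce = F.cross_entropy(logits_strong, pseudo_labels, reduction="none")
    pseudo_loss = (ce * mask.float()).mean()

    # Contrastive term between the two views: emphasizes sample-level mutual
    # information and reduces early reliance on noisy pseudo-labels.
    feats_weak = encoder(x_weak)
    contrastive_loss = info_nce_loss(feats_weak, feats_strong)

    return pseudo_loss + contrastive_weight * contrastive_loss
```

In this sketch, `contrastive_weight` could be annealed over training so the objective gradually shifts from sample-level to class-level mutual information, mirroring the schedule described in the abstract; the exact weighting and selection rule used by ASCL are not specified here.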
