Abstract

Instance-level contrastive learning methods such as SimCLR have proven powerful for representation learning. However, SimCLR suffers from sampling bias, feature bias, and model collapse. In this paper, we propose a set-level Sampling Enhanced Contrastive Learning (SECL) method based on SimCLR. A super-sampling procedure expands each sample's augmentations into a contrastive-positive set, which lets the model learn class-level features of the target sample and thereby reduces bias. The contrastive-positive set comprises Augmentations (the original augmented samples) and Neighbors (the super-sampled samples). We also introduce a samples-correlation strategy to prevent model collapse: a positive or a negative correlation loss is computed to balance the model's Alignment and Uniformity. SECL reaches 94.14% classification precision on the SST-2 dataset and 89.25% on the ARSC dataset; on the multi-class AGNews dataset it achieves 90.99%. All of these results are about 1% higher than those of SimCLR. Experiments also show that SECL converges faster during training and reduces the risk of bias and model collapse.
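The abstract describes the contrastive-positive set but not its exact loss. Below is a minimal PyTorch-style sketch of one plausible reading: a multi-positive InfoNCE in which each anchor is pulled toward its Augmentation and k super-sampled Neighbors drawn from an embedding bank. The function name secl_loss, the bank, k, and the temperature tau are illustrative assumptions, and the samples-correlation (Alignment/Uniformity) loss is omitted since the abstract does not specify it.

    import torch
    import torch.nn.functional as F

    def secl_loss(z1, z2, bank, k=4, tau=0.5):
        # z1, z2: (B, D) projections of two augmented views of the same batch.
        # bank:   (M, D) embeddings from which k Neighbors are super-sampled.
        # Inputs are normalised so dot products equal cosine similarities.
        z1, z2, bank = (F.normalize(t, dim=1) for t in (z1, z2, bank))

        sim = z1 @ z2.t() / tau              # (B, B) in-batch similarities
        nn_sim = z1 @ bank.t() / tau         # (B, M) similarities to the bank
        top_sim, _ = nn_sim.topk(k, dim=1)   # k closest bank entries = Neighbors

        # Positive set per anchor: its Augmentation (diagonal) plus k Neighbors.
        pos = torch.cat([sim.diag().unsqueeze(1), top_sim], dim=1)        # (B, 1+k)
        # Contrast against all in-batch pairs and the Neighbor terms.
        denom = torch.logsumexp(torch.cat([sim, top_sim], dim=1), dim=1)  # (B,)
        return -(pos - denom.unsqueeze(1)).mean()

In practice the bank would likely be a momentum queue of past embeddings, as in nearest-neighbor contrastive methods; that detail is an assumption, not something the abstract states.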
