Abstract
Semi-supervised learning has achieved impressive results and is commonly applied to text classification. However, when labeled texts are exceedingly limited, neural networks are prone to over-fitting because of the non-negligible inconsistency between model training and inference introduced by dropout, which randomly masks some neurons. To alleviate this inconsistency, we propose a simple Multiple Models Contrast learning method based on Consistent Regularization, named Multi-MCCR, which consists of multiple models with the same structure and a C-BiKL loss strategy. Specifically, each sample first passes through the multiple identical models, and because each model applies an independent dropout mask, it yields multiple different output distributions; this enriches the sample's output distributions and provides the conditions for the subsequent consistency approximation. The C-BiKL loss strategy then minimizes the combination of the bidirectional Kullback-Leibler (BiKL) divergence between these output distributions and the Cross-Entropy loss on labeled data, imposing a consistency constraint on the model (BiKL) while ensuring correct classification (Cross-Entropy). Through this multi-model contrast learning, the inconsistency between model training and inference caused by the randomness of dropout is alleviated, thereby reducing over-fitting and improving classification ability in scenarios with limited labeled samples. We conducted experiments on six widely used text classification datasets, covering sentiment analysis, topic categorization, and review classification, and the results show that our method is consistently effective for semi-supervised text classification with limited labeled texts.
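To make the C-BiKL objective concrete, below is a minimal sketch in PyTorch, assuming two models of identical architecture (K = 2) and an illustrative weighting factor lambda_bikl; the function names and hyperparameters are our own assumptions, not the authors' released code.

```python
import torch
import torch.nn.functional as F


def bikl(logits_a: torch.Tensor, logits_b: torch.Tensor) -> torch.Tensor:
    """Bidirectional KL divergence between two predicted distributions."""
    log_p_a = F.log_softmax(logits_a, dim=-1)
    log_p_b = F.log_softmax(logits_b, dim=-1)
    kl_ab = F.kl_div(log_p_a, log_p_b, log_target=True, reduction="batchmean")
    kl_ba = F.kl_div(log_p_b, log_p_a, log_target=True, reduction="batchmean")
    return kl_ab + kl_ba


def c_bikl_loss(logits_1, logits_2, labels=None, lambda_bikl=1.0):
    """Consistency term on every sample, plus Cross-Entropy on labeled ones."""
    loss = lambda_bikl * bikl(logits_1, logits_2)
    if labels is not None:  # labeled batch: add the supervised term
        loss = loss + 0.5 * (F.cross_entropy(logits_1, labels)
                             + F.cross_entropy(logits_2, labels))
    return loss
```

Because the BiKL term requires no labels, it can be computed on unlabeled texts as well, which is what makes the objective usable in the semi-supervised setting described above.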