Abstract

The textual semantics captured by a Pre-trained Language Model (PLM) are constrained by the text distribution of its original training corpus. Lacking sufficient contextual training data, the representations of low-frequency words in a PLM often fail to capture their actual semantics. Previous research has shown that semantic information from dictionaries can alleviate this problem. Unfortunately, these works neglected the potential of example sentences, which illustrate different target words in their various senses. To re-explore dictionary-based enhancement of PLMs, we propose DictPrompt, a novel comprehensive dictionary-based tuning approach that integrates recent prompt learning. We first collect a dataset based on the Oxford Advanced Learner's English Dictionary. Then, we design a set of comprehensive prompt templates that combine each word with its definition and example sentence. Finally, we insert a word-game training task between pre-training and fine-tuning using these templates, injecting additional semantic information into the PLM. We test our DictPrompt tuning method on three commonly used PLMs. Results on five fine-grained semantic tasks show that our dictionary-based secondary tuning brings additional gains in model performance: the best accuracy improves by 3.09% on average on the WiC task and by 7.93% on the WSC task. We also plot scatter diagrams of sentence embeddings for polysemous words; our method smooths the decision boundary and helps the model output more distinguishable embeddings. The code is available at https://github.com/xbdxwyh/Dictprompt.
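To make the template idea concrete, the following is a minimal sketch (not the authors' released code) of how a dictionary entry consisting of a word, its definition, and an example sentence might be combined into a cloze-style prompt for masked-language-model tuning. The template wording, the `build_dict_prompt` helper, and the use of the mask token are illustrative assumptions rather than the paper's exact templates.

```python
# Hypothetical sketch: turning a dictionary entry into a cloze-style prompt.
# The template text below is an assumption for illustration, not the paper's template.
from transformers import AutoTokenizer

def build_dict_prompt(word: str, definition: str, example: str, mask_token: str) -> str:
    # Mask the target word inside its example sentence so the PLM must recover it
    # from the definition context (a "word game"-style objective).
    masked_example = example.replace(word, mask_token, 1)
    return f'The word "{mask_token}" means: {definition}. Example: {masked_example}'

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
prompt = build_dict_prompt(
    word="bank",
    definition="the land alongside or sloping down to a river or lake",
    example="They pulled the canoe up onto the bank.",
    mask_token=tokenizer.mask_token,
)
print(prompt)
```

Prompts built this way could then be fed to the PLM with a standard masked-language-modeling loss during the intermediate tuning stage, before task-specific fine-tuning.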
