Abstract
Semantic matching is one of the critical technologies for intelligent customer service. Since Bidirectional Encoder Representations from Transformers (BERT) was proposed, fine-tuning a large-scale pre-trained language model has become the standard approach to text semantic matching. In practical applications, however, the accuracy of the BERT model is limited by the size of the pre-training corpus and by proper nouns specific to the target domain. To address this problem, we propose a knowledge enhancement method that masks model input based on a domain dictionary. Firstly, we use keyword matching on the model input to recognize and mask domain-specific words. Secondly, we use self-supervised learning to inject knowledge of the target domain into the BERT model. Thirdly, we fine-tune the BERT model on the public datasets LCQMC and BQboost. Finally, we evaluate the model's performance on a financial company's user data. The experimental results show that, after applying our method and BQboost, accuracy increases by 12.12% on average in practical applications.
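As a rough illustration of the first step, the sketch below masks dictionary-matched terms in a customer query before masked-language-model pre-training. The dictionary entries, the helper name, and the character-level masking convention are assumptions made for illustration, not the authors' exact implementation.

```python
# A minimal sketch of domain-dictionary masking, assuming a set of
# domain terms and BERT's character-level tokenization of Chinese text.
# Names and dictionary entries here are hypothetical.

def mask_domain_terms(text: str, domain_dict: set[str],
                      mask_token: str = "[MASK]") -> str:
    """Replace each character of a matched domain term with mask_token,
    scanning longer terms first so overlapping entries resolve greedily."""
    for term in sorted(domain_dict, key=len, reverse=True):
        # One mask token per character mirrors character-level tokenization.
        text = text.replace(term, mask_token * len(term))
    return text

if __name__ == "__main__":
    domain_dict = {"保单", "理赔"}  # hypothetical financial-domain terms
    print(mask_domain_terms("我想查询保单的理赔进度", domain_dict))
    # -> 我想查询[MASK][MASK]的[MASK][MASK]进度
```

The masked text can then serve as input for self-supervised masked-language-model training, so the model must predict the domain terms from context, injecting target-domain knowledge as described above.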