Abstract

Semantic vector representations of sentences in a specific domain play an important role in question answering and retrieval. However, neural models for sentence representation require sufficient training data, and a given target domain may lack semantic training data that is difficult to obtain, while abundant training data may already exist in many other domains. This paper attempts to use semantic training data from domains unrelated to the target domain to train a semantic model for the target domain and thereby improve its sentence representations. Experiments show that a BERT-based semantic representation model trained on out-of-domain data can effectively improve the semantic vector representation of the target domain.
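The core pipeline the abstract describes — encode each sentence into a fixed vector, then compare sentences by vector similarity for retrieval or question answering — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the random embedding table stands in for a BERT encoder fine-tuned on out-of-domain data, and mean pooling is one common (assumed here) way to turn token embeddings into a sentence vector.

```python
import numpy as np

# Stand-in for a trained encoder: a fixed random token-embedding table.
# In the paper's setting this would be a BERT model fine-tuned on
# semantic data from domains other than the target domain.
rng = np.random.default_rng(0)
VOCAB = {w: i for i, w in enumerate(
    "how do i reset my password account forgot the weather today".split())}
EMB = rng.normal(size=(len(VOCAB), 16))

def sentence_vector(sentence: str) -> np.ndarray:
    """Mean-pool token embeddings into a single sentence vector
    (a common pooling strategy for BERT-style sentence encoders)."""
    ids = [VOCAB[w] for w in sentence.lower().split() if w in VOCAB]
    return EMB[ids].mean(axis=0)

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity, the usual metric for comparing sentence vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

query = sentence_vector("how do i reset my password")
candidate = sentence_vector("i forgot my account password")
# With a well-trained domain-adapted encoder, paraphrases score high
# and unrelated sentences score low; retrieval ranks by this similarity.
score = cosine(query, candidate)
```

With a real encoder, the paper's claim amounts to this: even when the encoder was fine-tuned only on out-of-domain sentence pairs, the similarity scores it produces rank target-domain sentences better than an untrained baseline would.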
