Abstract

In the last few years, pre-trained models (PTMs) have become the foundation of downstream natural language processing tasks. Pre-training on large-scale corpora rich in latent semantic knowledge allows a model to learn the semantics of language. However, the general masked language model is not well suited to corpora with a great deal of irrelevant and noisy content, such as merchant information. In our merchant system, we have collected millions of merchant records, including merchant names and addresses. To handle this kind of short, noisy corpus and to incorporate multi-source external information into the model, in this paper we propose a weakly supervised merchant pre-trained model, called MCHPT, to learn representations of merchant language. The model is pre-trained with our designed pre-training tasks on a large-scale, weakly supervised real-world merchant dataset. Experimental results show that our model outperforms state-of-the-art pre-trained language models on four downstream merchant-related tasks.
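As a point of reference for the masked-language-model pre-training mentioned above, the sketch below shows a minimal MLM setup over short merchant name/address strings using Hugging Face Transformers. The base checkpoint ("bert-base-uncased"), the name/address pairing, the masking rate, and the example records are illustrative assumptions; the paper's actual weakly supervised pre-training tasks are not reproduced here.

```python
# Minimal sketch: masked language modeling over short merchant records.
# Assumptions (not from the paper): checkpoint, 15% masking, toy records.
from transformers import (
    BertTokenizerFast,
    BertForMaskedLM,
    DataCollatorForLanguageModeling,
)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Hypothetical merchant records: short, noisy name + address strings.
records = [
    ("Starbucks Coffee #1024", "88 West Nanjing Rd, Shanghai"),
    ("Lao Wang Noodle Shop", "Bldg 3, Chaoyang District, Beijing"),
]
names = [name for name, _ in records]
addresses = [addr for _, addr in records]

# Encode each record as a (name, address) sentence pair.
encodings = tokenizer(names, addresses, truncation=True, max_length=64)
examples = [
    {key: values[i] for key, values in encodings.items()}
    for i in range(len(records))
]

# The collator pads the batch and randomly masks 15% of the tokens,
# producing the MLM labels for the masked positions.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)
batch = collator(examples)

outputs = model(**batch)   # labels were added by the collator
loss = outputs.loss        # standard MLM cross-entropy loss
loss.backward()
```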
