Abstract

Pre-trained language models (PLMs) are a central topic in natural language processing (NLP), and their rapid development underlies much of the field's recent progress. In this article, we review important PLMs. First, we introduce the development history and achievements of PLMs. Second, we present several notable PLMs, including BERT and its variants, multimodal PLMs, PLMs combined with knowledge graphs, and PLMs applied to natural language generation. Finally, we summarize and look to the future of PLMs. We expect this article, together with the abundant literature it surveys, to provide a practical guide for learners to understand, use, and develop PLMs for various NLP tasks.
