Abstract

The pre-trained language model BERT has achieved great success in natural language processing by transferring knowledge from a resource-rich pre-training task to low-resource downstream tasks. The model has been recognized as a breakthrough that changed the paradigm of natural language processing. In this paper, we analyze a number of studies to classify and compare their research directions, and we examine the technical challenges that remain after BERT. First, pre-training relies on self-supervised learning and therefore depends entirely on the training data. If linguistic knowledge were introduced during training, better results could be obtained more efficiently; it is therefore necessary to develop methods for injecting external knowledge, such as linguistic information, into the training process. Second, BERT's pre-training uses two tasks, the masked language model and next sentence prediction, but other effective tasks should be studied and developed to reach a deeper understanding of natural language. Lastly, we should aim to develop eXplainable Artificial Intelligence (XAI) technology for natural language processing, making the model's processing transparent. Pre-trained language models aim to develop capabilities that can be applied to all natural language understanding tasks, and much research focuses on how to adapt a general language model effectively to downstream tasks, even when little data is available. We hope that the technical analysis reviewed in this study will give linguists and computer scientists an opportunity to understand recent technological achievements in natural language processing and to seek joint research.
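As a concrete illustration of the masked language model objective described above, the sketch below asks a pre-trained BERT model to fill in a masked token from its bidirectional context. It assumes the Hugging Face transformers and torch packages; the checkpoint name bert-base-uncased and the example sentence are illustrative choices, not part of the original paper.

```python
# Minimal masked-language-model sketch: BERT predicts the token hidden
# behind [MASK] using the surrounding context on both sides.
import torch
from transformers import BertForMaskedLM, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: [1, seq_len, vocab_size]

# Locate the [MASK] position and read off the highest-scoring token.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # typically prints "paris"
```

The same self-supervised recipe needs no labeled data: the training signal comes from masking tokens in raw text, which is why the abstract notes that pre-training "relies entirely on training data".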
