Transfer learning has made significant progress, yet its results in professional-domain applications remain inconsistent, and low-resource learning remains a considerable problem. This paper proposes a language processing model for history education built on BERT pre-training techniques. Two experiments were conducted to obtain comparative results and to select an appropriate modeling approach for making the implicit expertise of secondary-school history teaching explicit. The study compares a traditional method, represented by naive Bayes, with popular continued pre-training techniques, namely domain-adaptive and task-adaptive pre-training, to improve the effectiveness of transfer learning. Finally, this study builds targeted models based on real application needs and selects professional rules consistent with the application scenario. The use of continued pre-training enhances the accuracy of the professional-domain model.
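To make the continued pre-training step concrete, the following is a minimal sketch of domain-adaptive pre-training (further training BERT's masked-language-modeling objective on unlabeled in-domain text) using the Hugging Face transformers library. The checkpoint name, corpus file, and hyperparameters are illustrative assumptions, not details from the paper.

```python
# Sketch: domain-adaptive continued pre-training of BERT via masked language
# modeling. All names below (checkpoint, corpus path, hyperparameters) are
# hypothetical and chosen only for illustration.
from transformers import (
    BertTokenizerFast,
    BertForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

# Assumed base checkpoint; the paper does not specify which BERT variant it used.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = BertForMaskedLM.from_pretrained("bert-base-chinese")

# Hypothetical unlabeled in-domain corpus: one history-teaching passage per line.
dataset = load_dataset("text", data_files={"train": "history_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Masked-language-modeling objective: randomly mask 15% of input tokens.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="bert-history-dapt",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```

Task-adaptive pre-training follows the same recipe but replaces the broad domain corpus with the (unlabeled) text of the downstream task's own dataset; the resulting checkpoint is then fine-tuned on the labeled classification task.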