Abstract

Joint entity and relation extraction has achieved impressive advances and benefits NLP applications such as document understanding and knowledge graph construction. Typical methods break the joint task into several smaller components or stages for ease of implementation, but this sacrifices the interconnected knowledge within a triple. We therefore propose to model the entire triple jointly in a single module. Furthermore, labeling data for joint entity and relation extraction is costly and domain-specific, so it is important to improve performance on low-resource data and domain adaptation. To address this, we draw on two information-rich sources: models pre-trained on large-scale data and multi-domain text corpora. First, pretraining equips the model with the fundamental ability to perform joint entity and relation extraction. Second, meta-learning on multi-domain text improves the model's generalization, enabling it to perform well even with limited data. In this paper, we present MTL-JER, a Meta-Transfer Learning method for Joint Entity and Relation Extraction in low-resource settings. Extensive experiments on five datasets demonstrate that our model achieves the best results.
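
The abstract does not spell out the meta-learning procedure, but a common way to realize "meta-learning on multi-domain text" is a MAML-style episode loop (Finn et al., 2017). The first-order sketch below is purely illustrative: the names `meta_train_step` and `domain_episodes`, and the assumption that calling the model on a batch returns its joint extraction loss, are ours rather than details from the paper.

```python
import copy

import torch


def meta_train_step(model, domain_episodes, meta_optimizer,
                    inner_lr=1e-3, inner_steps=1):
    """One meta-training step over episodes sampled from multiple domains.

    Each episode is a (support_batch, query_batch) pair drawn from one
    source domain; `model(batch)` is assumed (hypothetically) to return
    the joint entity-relation extraction loss on that batch.
    """
    meta_optimizer.zero_grad()
    for support_batch, query_batch in domain_episodes:
        # Inner loop: adapt a throwaway copy of the model to the domain's
        # small support set, mimicking low-resource fine-tuning.
        adapted = copy.deepcopy(model)
        inner_opt = torch.optim.SGD(adapted.parameters(), lr=inner_lr)
        for _ in range(inner_steps):
            inner_opt.zero_grad()
            adapted(support_batch).backward()
            inner_opt.step()

        # Outer loop: measure how well the adapted copy generalizes to the
        # domain's query set, and accumulate first-order gradients back
        # onto the original (meta) parameters.
        query_loss = adapted(query_batch)
        grads = torch.autograd.grad(query_loss, adapted.parameters())
        for param, grad in zip(model.parameters(), grads):
            param.grad = grad if param.grad is None else param.grad + grad

    meta_optimizer.step()  # update the shared initialization
```

The first-order approximation (accumulating query-set gradients directly rather than differentiating through the inner update) keeps memory costs modest; at test time, the learned initialization would be fine-tuned on the target domain's small support set in the same way as the inner loop.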
