Abstract

Joint entity and relation extraction has driven impressive advances in NLP applications such as document understanding and knowledge graph construction. Typical methods break the joint task into several smaller components or stages for ease of implementation, but this loses the interconnected knowledge within the triple. We therefore propose to model the entire triple jointly in a single module. Furthermore, labeling data for joint entity and relation extraction is costly and domain-specific, so improving performance in low-resource settings and under domain adaptation is important. To this end, we draw on two information-rich sources: models pretrained on large corpora and multi-domain text corpora. First, pretraining equips the model with the fundamental ability to perform joint entity and relation extraction. Second, meta-learning on multi-domain text improves the model's generalization, enabling it to perform well even with limited data. In this paper, we present MTL-JER, a Meta-Transfer Learning method for Joint Entity and Relation Extraction in low-resource settings. Through extensive experiments on five datasets, we show that our model achieves the best results.
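
The abstract does not spell out the architecture or the meta-learning variant; as a rough illustration only, the sketch below shows one common way the two-stage recipe described above (pretraining, then meta-learning over multi-domain episodes) can be realized, using a first-order MAML-style loop in PyTorch. Every name here (JointExtractor, inner_adapt, meta_train) and all hyperparameters are hypothetical placeholders, not the authors' implementation.

    import copy
    import torch
    import torch.nn as nn

    class JointExtractor(nn.Module):
        """Stand-in for a model that scores (head, relation, tail) triples in one module."""
        def __init__(self, dim=128, n_labels=10):
            super().__init__()
            self.encoder = nn.Linear(dim, dim)            # proxy for a large pretrained encoder
            self.triple_scorer = nn.Linear(dim, n_labels) # single module scoring the whole triple

        def forward(self, x):
            return self.triple_scorer(torch.relu(self.encoder(x)))

    def inner_adapt(model, support_x, support_y, loss_fn, lr=1e-2, steps=1):
        """Inner loop: adapt a clone of the shared initialization on a domain's support set."""
        adapted = copy.deepcopy(model)
        opt = torch.optim.SGD(adapted.parameters(), lr=lr)
        for _ in range(steps):
            opt.zero_grad()
            loss_fn(adapted(support_x), support_y).backward()
            opt.step()
        return adapted

    def meta_train(model, domain_tasks, loss_fn, meta_lr=1e-3, epochs=10):
        """Outer loop (first-order MAML): evaluate each adapted clone on its query set
        and fold the resulting gradients back into the shared initialization."""
        meta_opt = torch.optim.Adam(model.parameters(), lr=meta_lr)
        for _ in range(epochs):
            meta_opt.zero_grad()
            for support_x, support_y, query_x, query_y in domain_tasks:
                adapted = inner_adapt(model, support_x, support_y, loss_fn)
                adapted.zero_grad()
                loss_fn(adapted(query_x), query_y).backward()
                # first-order approximation: reuse the clone's query-set gradients
                for p, q in zip(model.parameters(), adapted.parameters()):
                    g = q.grad.detach().clone()
                    p.grad = g if p.grad is None else p.grad + g
            meta_opt.step()

    # Toy usage: random tensors stand in for encoded episodes from three domains.
    tasks = [(torch.randn(8, 128), torch.randint(0, 10, (8,)),
              torch.randn(8, 128), torch.randint(0, 10, (8,)))
             for _ in range(3)]
    model = JointExtractor()  # in practice, initialized from the pretraining stage
    meta_train(model, tasks, nn.CrossEntropyLoss(), epochs=2)

The first-order approximation sidesteps second-order gradients through the inner loop, a common trade-off when the extractor is backed by a large pretrained encoder.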
