Abstract

Relation extraction between entity pairs is an increasingly important task in natural language processing. Recently, the pre-trained bidirectional encoder representations from transformers (BERT) model has performed excellently on text classification and sequence labelling tasks. Here, we incorporate high-level syntactic features, which capture the dependency between each word and the target entities, into the pre-trained language model. Our model also exploits the intermediate layers of BERT to acquire different levels of semantic information and combines them into multi-granularity features for the final relation classification. Our model offers a significant improvement over previously published relation-extraction methods on widely used data sets.
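To make the layer-combination idea concrete, below is a minimal PyTorch sketch, not the paper's exact architecture: it assumes the HuggingFace transformers library, pools the [CLS] vector from several intermediate BERT layers, and feeds the concatenation to a linear relation classifier. The class name, the choice of layers, the entity-marker convention ([E1]/[E2]), and the 19-class label set (as in SemEval-2010 Task 8) are all illustrative assumptions; the paper's dependency-based syntactic features are omitted here.

```python
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class MultiGranularityRelationClassifier(nn.Module):
    """Hypothetical sketch: concatenates the [CLS] representation from
    several intermediate BERT layers and classifies the relation
    between two marked entities."""

    def __init__(self, num_relations, layers=(-4, -3, -2, -1)):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.layers = layers  # which hidden layers to combine (assumed choice)
        hidden = self.bert.config.hidden_size
        self.classifier = nn.Linear(hidden * len(layers), num_relations)

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids,
                            attention_mask=attention_mask,
                            output_hidden_states=True)
        # hidden_states is a tuple of (num_layers + 1) tensors,
        # each of shape (batch, seq_len, hidden): embeddings + every layer.
        states = outputs.hidden_states
        # Take the [CLS] token (position 0) from each selected layer.
        cls_vectors = [states[i][:, 0, :] for i in self.layers]
        features = torch.cat(cls_vectors, dim=-1)
        return self.classifier(features)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = MultiGranularityRelationClassifier(num_relations=19)
enc = tokenizer("The [E1] company [/E1] was founded by [E2] Smith [/E2].",
                return_tensors="pt")
logits = model(enc["input_ids"], enc["attention_mask"])
```

In this sketch the later layers supply the different granularities of semantic information mentioned in the abstract; a faithful reimplementation would additionally encode, for each word, its dependency relation to the two target entities and fuse those syntactic features with the BERT representations before classification.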
