Abstract

Distantly supervised relation extraction is the problem of classifying the semantic relation between a pair of entities in a given sentence, where the training data is created automatically by aligning unlabelled text with a knowledge base. Prior research indicates that most sentences in the distantly supervised setting are long and benefit from attention over words. We note that the shortest dependency path between the entity pair in the syntactic parse tree of a sentence can guide word-based attention. In this paper, we propose a novel distantly supervised neural relation extraction method that uses the shortest dependency path to supervise the learning of word attention. Through extensive experiments on benchmark datasets, we demonstrate the effectiveness of our model.
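One plausible way to realize the supervision described above is to penalize attention distributions that place weight off the shortest dependency path (SDP). The sketch below is a minimal illustration, not the paper's actual formulation: it assumes the SDP words are given as a binary mask and computes a cross-entropy penalty between the model's word attention and a uniform distribution over SDP words. The function name `sdp_attention_loss` and the toy sentence are hypothetical.

```python
import numpy as np

def sdp_attention_loss(scores, sdp_mask, eps=1e-9):
    """Cross-entropy between a uniform distribution over SDP words
    and the softmax attention derived from raw word scores.

    scores   -- unnormalized attention scores, shape (n_words,)
    sdp_mask -- 1.0 for words on the shortest dependency path, else 0.0
    """
    # softmax over word scores (numerically stable)
    attn = np.exp(scores - scores.max())
    attn /= attn.sum()
    # target: uniform distribution over the SDP words
    target = sdp_mask / sdp_mask.sum()
    # cross-entropy H(target, attn); low when attention sits on the SDP
    return float(-(target * np.log(attn + eps)).sum())

# toy sentence: "Obama was born in Hawaii"
# SDP between the entities passes through "born" (positions 0, 2, 4)
sdp_mask = np.array([1.0, 0.0, 1.0, 0.0, 1.0])

on_path  = sdp_attention_loss(np.array([2.0, 0.0, 2.0, 0.0, 2.0]), sdp_mask)
off_path = sdp_attention_loss(np.array([0.0, 2.0, 0.0, 2.0, 0.0]), sdp_mask)
# attention concentrated on the SDP incurs a smaller penalty
assert on_path < off_path
```

In training, such a term would typically be added to the relation-classification loss with a weighting coefficient, so the attention is pushed toward dependency-path words without being hard-constrained to them.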
