Abstract

Distantly supervised relation extraction has been widely used to find novel relational facts between entities in text, and can be easily scaled to very large corpora. Previous studies on neural relation extraction treat this task as a multi-instance learning problem and encode sentences into low-dimensional spaces via neural networks. Although great progress has been made, these methods seldom consider the information carried by the entities themselves, which is of great significance to relation extraction. In this article, we propose several methods based on tree-based models to learn syntax-aware entity representations for neural relation extraction. First, we encode the context of each entity on the dependency tree into a sentence-level entity embedding using tree-structured neural network models. Then, we apply an inter-sentence attention mechanism to obtain a bag-level entity embedding over all sentences containing the given entity pair. Finally, we combine the sentence embeddings and entity embeddings for relation classification. Experimental results on a widely used real-world dataset indicate that our system outperforms state-of-the-art relation extraction systems.
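The abstract does not give the exact network details, but the three steps it describes map naturally onto a small amount of code. Below is a minimal PyTorch sketch, assuming a child-sum Tree-LSTM (Tai et al., 2015) as the tree-structured encoder and dot-product attention over the sentence bag; the class names (ChildSumTreeLSTMCell, BagAttention, RelationClassifier), dimensions, and dummy data are illustrative rather than taken from the paper.

```python
import torch
import torch.nn as nn


class ChildSumTreeLSTMCell(nn.Module):
    """One node update of a child-sum Tree-LSTM over a dependency tree."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.W_iou = nn.Linear(in_dim, 3 * hid_dim)
        self.U_iou = nn.Linear(hid_dim, 3 * hid_dim, bias=False)
        self.W_f = nn.Linear(in_dim, hid_dim)
        self.U_f = nn.Linear(hid_dim, hid_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # x: (in_dim,) embedding of the current word;
        # child_h, child_c: (num_children, hid_dim) states of its children.
        h_sum = child_h.sum(dim=0)  # child-sum aggregation
        i, o, u = torch.chunk(self.W_iou(x) + self.U_iou(h_sum), 3)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        # One forget gate per child, then gate each child's cell state.
        f = torch.sigmoid(self.W_f(x).unsqueeze(0) + self.U_f(child_h))
        c = i * u + (f * child_c).sum(dim=0)
        h = o * torch.tanh(c)
        return h, c


class BagAttention(nn.Module):
    """Inter-sentence attention over sentence-level entity embeddings."""

    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Parameter(torch.randn(dim))

    def forward(self, sent_ent_embs):
        # sent_ent_embs: (num_sentences, dim), one embedding per sentence
        # in the bag for the given entity pair.
        scores = sent_ent_embs @ self.query        # (num_sentences,)
        alpha = torch.softmax(scores, dim=0)       # attention weights
        return alpha @ sent_ent_embs               # bag-level embedding


class RelationClassifier(nn.Module):
    """Combines a sentence embedding with a bag-level entity embedding."""

    def __init__(self, sent_dim: int, ent_dim: int, num_relations: int):
        super().__init__()
        self.fc = nn.Linear(sent_dim + ent_dim, num_relations)

    def forward(self, sent_emb, ent_emb):
        return self.fc(torch.cat([sent_emb, ent_emb], dim=-1))


if __name__ == "__main__":
    hid = 64
    cell = ChildSumTreeLSTMCell(in_dim=50, hid_dim=hid)
    # A leaf node has no children, so its child states are empty tensors.
    h_leaf, c_leaf = cell(torch.randn(50), torch.zeros(0, hid), torch.zeros(0, hid))
    # Its head word in the dependency tree takes the leaf as a child.
    h_head, _ = cell(torch.randn(50), h_leaf.unsqueeze(0), c_leaf.unsqueeze(0))

    # Three sentences in the bag, each yielding an entity embedding.
    bag = torch.stack([h_head, torch.randn(hid), torch.randn(hid)])
    ent_emb = BagAttention(hid)(bag)

    sent_emb = torch.randn(128)  # from any sentence encoder, e.g. a CNN
    # 53 relation classes (incl. NA), as in the NYT-Freebase benchmark.
    logits = RelationClassifier(128, hid, num_relations=53)(sent_emb, ent_emb)
    print(logits.shape)  # torch.Size([53])
```

Under distant supervision, many sentences in a bag are noisy, so the attention weights let the model downweight sentences whose entity context does not support the labeled relation; this is the usual motivation for bag-level attention in this line of work.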
