Abstract
Relation extraction (RE) is an important task with wide applications. Distant supervision is widely used in RE because it can automatically construct labeled data and reduce manual annotation effort, but it usually produces many instances with incorrect labels. In addition, most existing relation extraction methods rely solely on the textual content of sentences to extract relations. In fact, many knowledge graphs are readily available and can provide useful information about entities and relations, which has the potential to alleviate the noisy-data problem and improve the performance of relation extraction. In this paper, we propose a knowledge-aware attention model that incorporates knowledge graph information into relation extraction. In our approach, we first learn representations of entities and relations from a knowledge graph using graph embedding methods. We then propose a knowledge-aware word attention model to select informative words in sentences for relation extraction. In addition, we propose a knowledge-aware sentence attention model to select useful sentences for RE, alleviating the noisy-data problem introduced by distant supervision. We conduct experiments on a widely used dataset, and the results show that our approach can effectively improve the performance of neural relation extraction.
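To make the knowledge-aware sentence attention idea concrete, the sketch below shows one plausible way to attend over the noisy sentences of an entity-pair bag using pretrained knowledge graph embeddings as the attention query (here a TransE-style difference of tail and head embeddings). All names, dimensions, and the TransE-style query are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class KnowledgeAwareSentenceAttention(nn.Module):
    """Sketch: attend over a bag of sentence encodings with a query
    derived from KG entity embeddings (assumed TransE-style: tail - head)."""

    def __init__(self, sent_dim: int, kg_dim: int, num_relations: int):
        super().__init__()
        # Project the KG-derived query into the sentence-encoding space.
        self.query_proj = nn.Linear(kg_dim, sent_dim)
        # Relation classifier over the attended bag representation.
        self.classifier = nn.Linear(sent_dim, num_relations)

    def forward(self, sent_reprs, head_emb, tail_emb):
        # sent_reprs: (num_sents, sent_dim) encodings of one entity-pair bag
        # head_emb, tail_emb: (kg_dim,) pretrained KG embeddings of the pair
        query = self.query_proj(tail_emb - head_emb)   # (sent_dim,)
        scores = sent_reprs @ query                    # (num_sents,)
        alpha = F.softmax(scores, dim=0)               # attention weights over sentences
        bag_repr = alpha @ sent_reprs                  # (sent_dim,) weighted bag encoding
        return self.classifier(bag_repr)               # relation logits

if __name__ == "__main__":
    # Hypothetical dimensions: 8 noisy sentences, 256-d sentence encodings,
    # 50-d KG embeddings, 53 relation classes.
    attn = KnowledgeAwareSentenceAttention(sent_dim=256, kg_dim=50, num_relations=53)
    sents = torch.randn(8, 256)
    head, tail = torch.randn(50), torch.randn(50)
    print(attn(sents, head, tail).shape)  # torch.Size([53])

Sentences whose encodings align with the KG-derived query receive higher weights, so mislabeled sentences introduced by distant supervision contribute less to the bag representation; the knowledge-aware word attention would apply the same idea at the word level inside each sentence.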