Abstract
Relation Extraction (RE) is a crucial step in completing Knowledge Graphs (KGs) by recognizing relations between entity pairs. However, it usually suffers from the long-tail problem, especially when distant supervision is used to label training data. In this paper, inspired by the rich semantic correlations between head relations and tail relations, we propose a knowledge-aware hierarchical attention (KA-HATT) relation extraction model. Following the relation hierarchy, multiple layers of attention are established that exploit knowledge from data-rich classes at the head to boost the performance of data-poor classes at the tail. We conduct extensive experiments on the publicly available New York Times (NYT) dataset. Experimental results show that, compared with baseline models, our model achieves significant improvements on relation extraction, especially for long-tail relations.
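To make the idea concrete, the sketch below (a hypothetical PyTorch illustration, not the paper's exact architecture) shows hierarchical attention over a sentence bag: each level of the target relation's ancestor chain supplies a query vector, attention is computed per level, and the per-level bag representations are concatenated so coarse (head) relations can share knowledge with fine-grained (tail) relations. The class name `HierarchicalBagAttention`, the dimensions, and the per-level embedding tables are all assumptions for illustration.

```python
# Minimal sketch of hierarchical bag attention over a relation hierarchy.
# Assumption: sentences in a bag are already encoded into fixed-size vectors.
import torch
import torch.nn as nn

class HierarchicalBagAttention(nn.Module):
    def __init__(self, hidden_dim: int, num_relations_per_level: list[int]):
        super().__init__()
        # One query embedding table per hierarchy level, e.g.
        # /people -> /people/person -> /people/person/nationality.
        self.level_queries = nn.ModuleList(
            nn.Embedding(n, hidden_dim) for n in num_relations_per_level
        )

    def forward(self, sent_reprs: torch.Tensor, relation_chain: list[int]) -> torch.Tensor:
        """
        sent_reprs:     (num_sentences, hidden_dim) encoded sentences of one bag
        relation_chain: one relation id per hierarchy level, coarse to fine
        returns:        (num_levels * hidden_dim,) concatenated bag representation
        """
        level_reprs = []
        for level, rel_id in enumerate(relation_chain):
            query = self.level_queries[level](
                torch.tensor(rel_id, device=sent_reprs.device)
            )                                        # (hidden_dim,)
            scores = sent_reprs @ query              # (num_sentences,)
            alpha = torch.softmax(scores, dim=0)     # attention over the bag
            level_reprs.append(alpha @ sent_reprs)   # weighted bag vector per level
        return torch.cat(level_reprs, dim=-1)

# Usage: a bag of 4 sentence vectors and a 3-level relation hierarchy.
bag = torch.randn(4, 256)
attn = HierarchicalBagAttention(256, [8, 30, 53])
bag_repr = attn(bag, relation_chain=[2, 11, 40])     # shape: (768,)
```

Because the coarse-level query tables are shared among all relations under the same parent, tail relations with few training instances can still benefit from attention parameters trained on their data-rich siblings, which is the intuition the abstract describes.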