Abstract

Open Relation Extraction (ORE) remains a challenging task: it aims to obtain semantic representations by discovering arbitrary relations in unstructured text. Conventional methods rely heavily on feature engineering or syntactic parsing, which makes them inefficient or prone to cascading errors. Recently, supervised deep learning methods have become a promising way to address the ORE task. However, two main challenges remain: (1) the lack of a sufficiently large labeled corpus to support supervised training; and (2) the design of a neural architecture suited to the characteristics of open relation extraction. In this paper, we build a large-scale, high-quality training corpus in a fully automated way, and we design a tagging scheme that transforms the ORE task into a sequence tagging problem. Furthermore, we propose a hybrid neural network model (HNN4ORT) for open relation tagging. The model employs the Ordered Neurons LSTM to encode potential syntactic information and capture the associations between arguments and relations. It also introduces a novel Dual Aware Mechanism, comprising Local-aware Attention and Global-aware Convolution. The two kinds of awareness complement each other: the model takes sentence-level semantics as a global perspective while exploiting salient local features to produce sparse annotations. Experimental results on various test sets show that our model achieves state-of-the-art performance compared with conventional methods and other neural models.
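To make the dual-aware architecture concrete, the sketch below shows, in PyTorch, one way such a tagger could be wired together. This is a minimal illustration, not the authors' implementation: the class name `HNN4ORTSketch`, all layer sizes, and the use of a vanilla BiLSTM as a stand-in for the Ordered Neurons LSTM are assumptions made only to keep the example self-contained.

```python
import torch
import torch.nn as nn

class HNN4ORTSketch(nn.Module):
    """Hypothetical sketch of a dual-aware open relation tagger.
    Layer choices and sizes are illustrative assumptions."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_tags=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Stand-in encoder: the paper uses an Ordered Neurons LSTM
        # (ON-LSTM) to capture latent syntactic structure; a plain
        # BiLSTM is substituted here to keep the sketch runnable.
        self.encoder = nn.LSTM(embed_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # Local-aware attention: per-token attention that highlights
        # salient local features for sparse tag annotation.
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads=4,
                                          batch_first=True)
        # Global-aware convolution: a 1-D convolution summarizing
        # sentence-level context at each position.
        self.conv = nn.Conv1d(2 * hidden_dim, 2 * hidden_dim,
                              kernel_size=3, padding=1)
        self.classifier = nn.Linear(4 * hidden_dim, num_tags)

    def forward(self, token_ids):
        x = self.embed(token_ids)                  # (B, T, E)
        h, _ = self.encoder(x)                     # (B, T, 2H)
        local, _ = self.attn(h, h, h)              # local-aware features
        glob = self.conv(h.transpose(1, 2)).transpose(1, 2)  # global features
        fused = torch.cat([local, glob], dim=-1)   # dual-aware fusion
        return self.classifier(fused)              # per-token tag logits


# Toy usage: tag a batch of two 6-token sentences.
model = HNN4ORTSketch(vocab_size=1000)
logits = model(torch.randint(0, 1000, (2, 6)))
print(logits.shape)  # torch.Size([2, 6, 5])
```

The fusion step concatenates the two feature streams so the per-token classifier can weigh sentence-level context against locally salient cues; other combination strategies (gating, summation) would fit the same description in the abstract.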
