Abstract

We present a novel end-to-end model that jointly extracts semantic relations and their argument entities from sentence text. The model requires no handcrafted feature set or auxiliary toolkit, and hence it can be readily extended to a wide range of sequence tagging tasks. This paper studies a new way of using the word morphology feature for relation extraction: we combine the word morphology feature with the semantic feature to enrich the representational capacity of the input vectors. We then develop an input information enhanced unit for the bidirectional long short-term memory network (Bi-LSTM) to mitigate the information loss caused by the gate and concatenation operations in the LSTM memory unit. A new tagging scheme with uncertain labels and a corresponding objective function are introduced to reduce the interference from non-entity words. Experiments are performed on three datasets: the New York Times (NYT) and ACE2005 datasets for relation extraction, and the SemEval-2010 Task 8 dataset for relation classification. The results demonstrate that our model achieves a significant improvement over the state-of-the-art model for relation extraction on the NYT dataset and competitive performance on the ACE2005 dataset.
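To make the feature-combination idea concrete, the following is a minimal sketch (in PyTorch) of how a word morphology feature could be concatenated with a semantic word-embedding feature and fed to a Bi-LSTM sequence tagger. It is not the paper's exact architecture: the character-level Bi-LSTM used as the morphology encoder, the class name, all dimensions, and the tag count are illustrative assumptions, and the input information enhanced unit and uncertain-label tagging scheme are not implemented here.

# Minimal sketch, assuming a character-level Bi-LSTM as one plausible morphology
# encoder; names and dimensions are illustrative, not taken from the paper.
import torch
import torch.nn as nn


class MorphologyBiLSTMTagger(nn.Module):
    def __init__(self, word_vocab, char_vocab, num_tags,
                 word_dim=100, char_dim=30, char_hidden=25, hidden=128):
        super().__init__()
        self.word_emb = nn.Embedding(word_vocab, word_dim)      # semantic feature
        self.char_emb = nn.Embedding(char_vocab, char_dim)
        # Character-level Bi-LSTM producing a per-word morphology vector.
        self.char_lstm = nn.LSTM(char_dim, char_hidden,
                                 batch_first=True, bidirectional=True)
        # Sentence-level Bi-LSTM over the concatenated feature vectors.
        self.sent_lstm = nn.LSTM(word_dim + 2 * char_hidden, hidden,
                                 batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, num_tags)               # per-token tag scores

    def forward(self, word_ids, char_ids):
        # word_ids: (batch, seq_len); char_ids: (batch, seq_len, max_word_len)
        b, s, c = char_ids.shape
        chars = self.char_emb(char_ids).view(b * s, c, -1)
        _, (h, _) = self.char_lstm(chars)                 # final states, both directions
        morph = torch.cat([h[0], h[1]], dim=-1).view(b, s, -1)
        feats = torch.cat([self.word_emb(word_ids), morph], dim=-1)
        enc, _ = self.sent_lstm(feats)
        return self.out(enc)                              # (batch, seq_len, num_tags)


# Toy usage: tag scores for a batch of two 5-token sentences (random indices).
model = MorphologyBiLSTMTagger(word_vocab=1000, char_vocab=80, num_tags=9)
words = torch.randint(0, 1000, (2, 5))
chars = torch.randint(0, 80, (2, 5, 12))
print(model(words, chars).shape)  # torch.Size([2, 5, 9])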
