Abstract

Although distantly supervised relation extraction has been widely used, its performance suffers from the wrong labeling problem. When eliminating noise, much previous work using neural networks has failed to balance the trade-off between capturing long dependencies and computational complexity, and has ignored the potential noise in external knowledge. In this paper, we propose a neural model that incorporates dilated convolutional neural networks with soft entity type constraints to jointly tackle these two issues. Rather than using traditional convolutional or recurrent neural networks, we adopt dilated convolutional networks as the sentence encoder, so as to capture long dependencies over large-scale context and improve robustness against local noise while keeping computation efficient. Furthermore, we utilize entity types, which we regard as external knowledge that may itself be noisy, to denoise relation classification: we take the constraints between entity types and relations into account while learning more precise entity types and attention weights simultaneously. Experimental results on the New York Times dataset show that our model, incorporating dilated convolutional networks with soft entity type constraints, improves distantly supervised relation extraction and achieves state-of-the-art results compared with baselines.
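The abstract does not spell out the encoder architecture, but the core property of dilated convolutions it relies on can be illustrated concretely: stacking layers with exponentially growing dilation rates makes the receptive field grow exponentially while the parameter count grows only linearly. The following is a minimal numpy sketch of that idea (the kernel size, dilation schedule, and embedding dimension are illustrative assumptions, not the paper's actual configuration):

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """Valid 1D convolution of a token sequence x (T, C_in) with a
    kernel w (k, C_in, C_out), sampling inputs `dilation` steps apart."""
    k = w.shape[0]
    span = (k - 1) * dilation + 1          # input positions a kernel touches
    T_out = x.shape[0] - span + 1
    out = np.zeros((T_out, w.shape[2]))
    for t in range(T_out):
        taps = x[t : t + span : dilation]  # k positions, `dilation` apart
        out[t] = np.einsum('kc,kco->o', taps, w)
    return out

# Stack three layers with exponentially growing dilation (1, 2, 4).
rng = np.random.default_rng(0)
h = rng.standard_normal((20, 8))           # 20 tokens, 8-dim embeddings (assumed)
receptive_field = 1
for d in (1, 2, 4):
    w = rng.standard_normal((3, h.shape[1], 8)) * 0.1
    h = dilated_conv1d(h, w, d)
    receptive_field += (3 - 1) * d         # each layer adds (k-1)*d positions

print(receptive_field)  # 15: top-layer outputs see 15 consecutive tokens
print(h.shape)          # (6, 8): 20 - (15 - 1) valid positions remain
```

Three 3-tap layers thus cover a 15-token window with only 3 × 3 kernels' worth of weights, which is the efficiency-versus-long-dependency trade-off the abstract refers to; a plain (undilated) stack would need far more layers or wider kernels to match that span.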
