Abstract

Negation and speculation scope detection is an important task in natural language processing. Previous studies have consistently shown that syntactic information is crucial to the task. However, this prior work relies mainly on human-designed discrete features and local features extracted from the dependency tree, which limits performance. In this paper, we propose a recursive neural network sequence labeling model for the task, which represents the whole dependency tree globally and learns syntactic features automatically. Specifically, the recursive neural network first learns a high-level representation for the words in the context of each sentence and captures the cue word together with its target scope through the global dependency structure. Then, a CRF layer takes the representation from the recursive neural network as input to jointly decode the labels for the whole sentence. Experimental results on the English BioScope dataset and the Chinese CNeSp dataset show that our model outperforms state-of-the-art systems.
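
To make the described architecture concrete, the following is a minimal sketch, not the authors' implementation: a recursive network composes word vectors bottom-up along a dependency tree to produce per-token representations, and a Viterbi decode over CRF-style emission and transition scores labels the whole sentence jointly. The dimensions, parameter names, toy sentence, and tree are all illustrative assumptions, and the weights here are random rather than trained.

    import numpy as np

    rng = np.random.default_rng(0)
    DIM, N_LABELS = 8, 3  # hidden size and label set (e.g. O / B-scope / I-scope), assumed

    def compose_tree(embeddings, children, root):
        """Bottom-up recursive composition over a dependency tree.

        embeddings: (n_tokens, DIM) word vectors
        children:   list of child-index lists per token
        Returns one hidden state per token, each summarizing its subtree.
        """
        n = embeddings.shape[0]
        W_w = rng.normal(0, 0.1, (DIM, DIM))  # word transform
        W_c = rng.normal(0, 0.1, (DIM, DIM))  # child-subtree transform
        h = np.zeros((n, DIM))

        def visit(i):
            acc = W_w @ embeddings[i]
            for c in children[i]:
                visit(c)
                acc += W_c @ h[c]  # fold each child subtree into its head
            h[i] = np.tanh(acc)

        visit(root)
        return h

    def viterbi(emissions, transitions):
        """Joint decoding: the label sequence maximizing emission + transition scores."""
        n, k = emissions.shape
        score = emissions[0].copy()
        back = np.zeros((n, k), dtype=int)
        for t in range(1, n):
            cand = score[:, None] + transitions  # (prev label, curr label) scores
            back[t] = cand.argmax(axis=0)
            score = cand.max(axis=0) + emissions[t]
        path = [int(score.argmax())]
        for t in range(n - 1, 0, -1):
            path.append(int(back[t][path[-1]]))
        return path[::-1]

    # Toy sentence "no evidence of disease": "evidence" (token 1) is the root
    # and "no" (token 0) is the negation cue; the tree below is illustrative.
    children = [[], [0, 3], [], [2]]
    root = 1
    emb = rng.normal(0, 1, (4, DIM))

    h = compose_tree(emb, children, root)
    W_out = rng.normal(0, 0.1, (DIM, N_LABELS))
    emissions = h @ W_out
    transitions = rng.normal(0, 0.1, (N_LABELS, N_LABELS))
    print(viterbi(emissions, transitions))  # label indices; arbitrary here since weights are untrained

One simplification to note: a single bottom-up pass gives each node a view of its subtree only, whereas the model in the paper represents the dependency tree globally, so every token's representation can reflect the cue and its scope anywhere in the sentence.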
