Abstract

Deeply mining semantic features is critical for information extraction. Tree-structured models are a linguistically attractive option because they directly represent the syntactic structure of a sentence. The Tree-LSTM was introduced to capture these syntactic properties through tree-structured network topologies. To alleviate the limitations of the Tree-LSTM, we develop gated mechanism variants for tree-structured networks; in a tree-structured model, the gating is more complex and diverse. We apply the Child-Sum Tree-LSTM and Child-Sum Tree-GRU to recognizing biomedical event triggers, and develop two new gated variants that incorporate peephole connections and a coupled gate mechanism into the tree-structured model. The experimental results show the advantage of gated units: the Child-Sum Tree-LSTM achieves the best results among the gated tree-structured models, and the performance of the variants is nearly the same as the Child-Sum Tree-LSTM, while the Child-Sum Tree-GRU and Child-Sum Tree-coupled variants reduce computation time.
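To make the node update the abstract builds on concrete, the following is a minimal sketch of a Child-Sum Tree-LSTM cell in PyTorch, in the style of Tai et al. (2015): the input, output, and candidate gates are computed from the sum of the children's hidden states, while a separate forget gate is computed per child. The module and variable names (ChildSumTreeLSTMCell, W_iou, W_f, etc.) are illustrative assumptions, not the paper's own implementation, and the peephole and coupled variants described in the paper are not shown here.

```python
import torch
import torch.nn as nn

class ChildSumTreeLSTMCell(nn.Module):
    """Minimal Child-Sum Tree-LSTM node update (illustrative sketch)."""

    def __init__(self, input_dim, hidden_dim):
        super().__init__()
        # Input, output, and candidate gates use the summed child hidden state.
        self.W_iou = nn.Linear(input_dim, 3 * hidden_dim)
        self.U_iou = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)
        # One forget gate per child, computed against that child's hidden state.
        self.W_f = nn.Linear(input_dim, hidden_dim)
        self.U_f = nn.Linear(hidden_dim, hidden_dim, bias=False)

    def forward(self, x, child_h, child_c):
        # x:       (input_dim,)                 input at the current node
        # child_h: (num_children, hidden_dim)   children's hidden states
        # child_c: (num_children, hidden_dim)   children's cell states
        h_sum = child_h.sum(dim=0)                      # h~_j = sum_k h_k
        iou = self.W_iou(x) + self.U_iou(h_sum)
        i, o, u = torch.chunk(iou, 3, dim=-1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        # Separate forget gate for each child k: f_jk = sigmoid(W_f x + U_f h_k)
        f = torch.sigmoid(self.W_f(x).unsqueeze(0) + self.U_f(child_h))
        c = i * u + (f * child_c).sum(dim=0)            # new cell state
        h = o * torch.tanh(c)                           # new hidden state
        return h, c


# Toy usage: one node with two children.
cell = ChildSumTreeLSTMCell(input_dim=4, hidden_dim=8)
x = torch.randn(4)
child_h = torch.randn(2, 8)
child_c = torch.randn(2, 8)
h, c = cell(x, child_h, child_c)
```

The per-child forget gate is what distinguishes the Child-Sum formulation from a chain LSTM; a Tree-GRU or coupled variant would replace or merge these gates to cut the amount of computation per node, which is consistent with the runtime reduction reported in the abstract.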
