Abstract

A new knowledge-based probabilistic dependency parsing method (KPDP) is presented to compensate for the local optimization problem of purely probabilistic models. KPDP consists of two stages: (1) selecting a set of candidate parse trees with an extended bottom-up chart parsing algorithm that uses Maximum Entropy models to compute single-arc probabilities; (2) finding the best parse tree with the help of word knowledge. Based on case grammar theory, the word knowledge is represented as patterns that group arcs sharing the same head. KPDP is evaluated experimentally on the dataset distributed in the CoNLL 2008 shared task. An unlabelled arc score of 87% is reported, 3.39% higher than the baseline model without word knowledge. This work should contribute to and stimulate further research in the field of parsing.
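The two-stage idea can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the arc probabilities are made-up stand-ins for a trained Maximum Entropy model, exhaustive enumeration of head assignments replaces the paper's bottom-up chart parsing, and the pattern bonus is an assumed stand-in for the case-grammar word knowledge that groups arcs sharing a head.

```python
from itertools import product

def arc_prob(head, dep):
    """Stage 1 stand-in: a toy table in place of a MaxEnt single-arc model."""
    table = {("ate", "John"): 0.9, ("ate", "apples"): 0.6,
             ("apples", "John"): 0.3, ("John", "apples"): 0.2}
    return table.get((head, dep), 0.05)

# Stage 2 stand-in: case-grammar-style frames, keyed by head word.
# A frame lists one expected set of dependents for that head.
PATTERNS = {"ate": [{"John", "apples"}]}

def pattern_bonus(heads):
    """Reward trees whose arcs, grouped by head, match a known frame."""
    by_head = {}
    for dep, head in heads.items():
        by_head.setdefault(head, set()).add(dep)
    bonus = 1.0
    for head, deps in by_head.items():
        if deps in PATTERNS.get(head, []):
            bonus *= 2.0  # illustrative weight for a frame match
    return bonus

def parse(words, root):
    """Enumerate dependency trees (tiny sentences only) and pick the best."""
    deps = [w for w in words if w != root]
    best, best_score = None, -1.0
    for choice in product(words, repeat=len(deps)):
        heads = dict(zip(deps, choice))
        if any(h == d for d, h in heads.items()):
            continue  # a word cannot head itself
        # Reject cyclic assignments: every dependent must reach the root.
        ok = True
        for d in deps:
            seen, cur = set(), d
            while cur != root:
                if cur in seen:
                    ok = False
                    break
                seen.add(cur)
                cur = heads[cur]
            if not ok:
                break
        if not ok:
            continue
        score = 1.0
        for dep, head in heads.items():
            score *= arc_prob(head, dep)   # stage 1: arc probabilities
        score *= pattern_bonus(heads)      # stage 2: word knowledge
        if score > best_score:
            best, best_score = heads, score
    return best

print(parse(["John", "ate", "apples"], "ate"))
# → {'John': 'ate', 'apples': 'ate'}
```

Here the plain probabilistic score already favors attaching both dependents to "ate", and the frame bonus reinforces that choice; in harder cases the word knowledge can overturn a locally optimal but globally wrong attachment, which is the problem the paper targets.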
