Abstract

In this work, we propose to use distributed word representations in a greedy, transition-based dependency parsing framework. Instead of relying on a very large number of sparse indicator features, the multinomial logistic regression classifier employed by the parser learns and uses a small number of dense features, which makes it very fast. The distributed word representations are produced by a continuous skip-gram model with a neural network architecture. Experiments on a Vietnamese dependency treebank show that the parser not only runs faster but also achieves higher accuracy than a conventional transition-based dependency parser.
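
For illustration only (the paper's own implementation is not reproduced here), the sketch below combines the two ingredients the abstract describes: skip-gram word embeddings and a multinomial logistic regression classifier over a small number of dense features built from the parser configuration. The toy corpus, the feature template, and the transition labels are hypothetical placeholders; gensim's Word2Vec (with sg=1) and scikit-learn's LogisticRegression stand in for whatever tooling the authors actually used.

```python
# Minimal sketch, not the authors' code: skip-gram embeddings feeding a
# dense-feature multinomial logistic regression transition classifier.
import numpy as np
from gensim.models import Word2Vec
from sklearn.linear_model import LogisticRegression

# (1) Train continuous skip-gram embeddings (sg=1) on a toy tokenized corpus.
corpus = [["tôi", "là", "sinh_viên"], ["cô", "ấy", "đọc", "sách"]]
emb = Word2Vec(corpus, vector_size=50, window=5, sg=1, min_count=1)

def word_vec(w):
    """Look up a dense vector; fall back to zeros for unknown words."""
    return emb.wv[w] if w in emb.wv else np.zeros(emb.vector_size)

def features(stack_top, buffer_front):
    """Dense feature vector: a concatenation of a few word embeddings,
    replacing millions of sparse indicator features."""
    return np.concatenate([word_vec(stack_top), word_vec(buffer_front)])

# (2) Multinomial logistic regression over transitions (SHIFT / LEFT-ARC / ...),
# trained on hypothetical oracle configurations.
X = np.vstack([features("tôi", "là"), features("cô", "ấy")])
y = ["SHIFT", "LEFT-ARC"]
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict([features("sinh_viên", "đọc")]))
```

Because the classifier sees only a short, fixed-length dense vector per configuration, prediction at parsing time reduces to a few small matrix operations, which is where the speed advantage over sparse indicator features comes from.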
