Abstract

Abstract Meaning Representation (AMR) parsing aims to represent a sentence as a structured directed acyclic graph (DAG) in order to capture the meaning of the text. This paper extends an existing two-stage pipeline AMR parser with state-of-the-art techniques from dependency parsing. First, Pointer-Generator Networks are used to handle out-of-vocabulary words in the concept identification stage, with improved initialization via word- and character-level embeddings. Second, the performance of the Relation Identification module is improved by jointly training the Heads Selection and the Arcs Labeling components. Finally, we underline the difficulty of end-to-end training with recurrent modules when the deep neural network is constructed statically, and explore a dynamic construction implementation that continuously adapts the computation graph, thus potentially enabling end-to-end training in the proposed pipeline solution.
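As a rough illustration of the copy mechanism behind the Pointer-Generator approach mentioned above (not the paper's implementation), the sketch below mixes a generation distribution over the vocabulary with a copy distribution over the source tokens. The PyTorch framing, function name, and tensor shapes are assumptions introduced here for clarity only.

```python
import torch
import torch.nn.functional as F


def pointer_generator_distribution(vocab_logits, attention, src_token_ids, p_gen):
    """Combine a generation distribution with a copy distribution (pointer-generator step).

    vocab_logits:  (batch, vocab_size)  decoder scores over the vocabulary
    attention:     (batch, src_len)     attention weights over the source tokens
    src_token_ids: (batch, src_len)     vocabulary ids of the source tokens
                                        (assumed already mapped into the vocabulary)
    p_gen:         (batch, 1)           probability of generating rather than copying
    """
    p_vocab = F.softmax(vocab_logits, dim=-1)            # generation distribution
    copy_dist = torch.zeros_like(p_vocab)                # copy distribution over the vocab
    copy_dist.scatter_add_(1, src_token_ids, attention)  # accumulate attention mass per source word
    return p_gen * p_vocab + (1.0 - p_gen) * copy_dist   # final mixture


# Toy usage with hypothetical sizes: batch of 2, vocabulary of 10, source length of 4.
vocab_logits = torch.randn(2, 10)
attention = F.softmax(torch.randn(2, 4), dim=-1)
src_token_ids = torch.randint(0, 10, (2, 4))
p_gen = torch.sigmoid(torch.randn(2, 1))
dist = pointer_generator_distribution(vocab_logits, attention, src_token_ids, p_gen)
assert torch.allclose(dist.sum(dim=-1), torch.ones(2))   # mixture stays a valid distribution
```

Because the copy term reuses attention over the input sentence, tokens absent from the training vocabulary can still be produced as concepts, which is the motivation for using this mechanism in the concept identification stage.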
