ICL-HD at SemEval-2016 Task 8: Meaning Representation Parsing - Augmenting AMR Parsing with a Preposition Semantic Role Labeling Neural Network


Introduction

Progress in Natural Language Processing has led to a multitude of well-motivated tasks that each represent part of a sentence's meaning, but the result is a meaning description spread over separate, unconnected annotations. These separate levels of semantic annotation, such as co-reference or named entities, and the lack of simple human-readable corpora encoding whole sentence meanings led to the Abstract Meaning Representation (AMR) formalism (Banarescu et al., 2013). AMR captures sentence meaning with rooted, directed, labeled graphs, where sentences with the same meaning receive the same AMR. Nodes represent concepts, and edges represent the semantic relationships that hold between these concepts. These graphs are encoded in a bracketed format and can be visually represented in a human-understandable way. An example of an AMR graph is given in Figure 1: there is a concept RECOMMEND-01, which is the root of the graph, and a concept OFFER-01 that stands in the semantic relationship ARG1 to RECOMMEND-01.
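The bracketed encoding mentioned above is AMR's PENMAN-style notation. As a minimal sketch of what the Figure 1 relationship might look like in that format (the variable names and the omission of further arguments are illustrative assumptions, not copied from the paper's figure):

```
(r / recommend-01
   :ARG1 (o / offer-01))
```

Here r and o are variables bound to concept instances, and :ARG1 is the labeled edge from RECOMMEND-01 to OFFER-01.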
