Abstract

Linguistic structures capture varying degrees of information in natural language text, ranging from simple per-word part-of-speech tags to more complex phrases bearing semantic roles. In Natural Language Processing, it is common to use a pipeline approach in which simpler tasks or structures serve as inputs to more complex problems. We explore the Semantic Role Labeling pipeline, which is composed of part-of-speech tagging, syntactic parsing, semantic argument identification, and semantic role classification. In particular, we propose generative models for dependency parsing and semantic role classification that use interconnected latent variables to encode the parsing history in the first model and role correlations in the second. The dependency parsing model is based on Temporal Restricted Boltzmann Machines, and we show its effectiveness in reducing the feature engineering effort. The semantic role classification model is a Bayesian model in which we demonstrate a way to combine local constituent features with global role correlations in a unified framework.
