Abstract

This paper presents our solution for jointly parsing syntactic and semantic dependencies. A Maximum Entropy (ME) classifier is used in this system, and a Mutual Information (MI) model is applied to feature selection for dependency labeling. Results show that the MI model allows the system to achieve better performance while reducing training time.

Highlights

  • Since 2002, semantic role labeling, which focuses on recognizing and labeling semantic arguments, has received considerable interest because of its significant contribution to many Natural Language Processing (NLP) applications, such as information extraction, question answering, machine translation, and paraphrasing. In each of the past four years, the Conference on Computational Natural Language Learning (CoNLL) featured an associated shared task that allowed participants to train and test their Semantic Role Labeling (SRL) or syntactic systems on the same data sets and share their experiences

  • Suppose (p, d) is a pair consisting of a predicate and one of its possible dependents, T is the dependency tree produced by syntactic parsing, and L is the set of semantic dependency labels; null is included as a special tag indicating no semantic dependency between p and d

  • We present a semantic dependency system, which includes a syntactic parsing module, a predicate tagging module, and a dependency labeling module
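The problem formulation in the highlights above (classify every predicate-dependent couple (p, d) over the label set L, with null marking "no semantic dependency") can be sketched as follows. This is a minimal illustration only: the label set, the feature extractor, and the classifier interface are hypothetical placeholders, not the paper's actual implementation.

```python
# Sketch of the pair-classification formulation: for each predicate p and
# candidate dependent d in the dependency tree T, predict a label from L,
# where "null" means no semantic dependency holds between p and d.
# All names below (extract_features, LABELS, the toy tree layout) are
# illustrative assumptions, not the paper's code.

LABELS = ["A0", "A1", "AM-TMP", "null"]  # L plus the special null tag

def extract_features(p, d, tree):
    """Toy feature extractor: position of d relative to p, and the POS pair."""
    return {
        "position": "before" if tree[d]["index"] < tree[p]["index"] else "after",
        "pos_pair": tree[p]["pos"] + "_" + tree[d]["pos"],
    }

def label_pairs(predicates, tokens, tree, classify):
    """Classify every (p, d) couple; keep only non-null semantic dependencies."""
    deps = []
    for p in predicates:
        for d in tokens:
            if d == p:
                continue
            label = classify(extract_features(p, d, tree))
            if label != "null":
                deps.append((p, d, label))
    return deps
```

In this formulation the classifier sees every candidate couple, so the null tag does most of the work: the vast majority of (p, d) pairs carry no semantic relation.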


Summary

Introduction

Since 2002, semantic role labeling, which focuses on recognizing and labeling semantic arguments, has received considerable interest because of its significant contribution to many Natural Language Processing (NLP) applications, such as information extraction, question answering, machine translation, and paraphrasing. In the 2005 shared task, up to 8 teams used the Maximum Entropy (ME) statistical framework (Che, Liu, Li, Hu, & Liu, 2005; Haghighi, Toutanova, & Manning, 2005; Park & Rim, 2005; Sang, Canisius, van den Bosch, & Bogers, 2005; Sutton & McCallum, 2005; Tsai, Wu, Lin, & Hsu, 2005; Venkatapathy, Bharati, & Reddy, 2005; Yi & Palmer, 2005). For these reasons, we propose a solution based on the Maximum Entropy model. Unlike prior systems, a mutual information method is used to decrease the number of features and reduce training time efficiently, with little impact on results.
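Mutual-information feature selection, as used here to shrink the feature set, scores each feature template by the MI between its values X and the labels Y, I(X; Y) = Σ p(x, y) log [p(x, y) / (p(x) p(y))], and keeps the highest-scoring templates. The sketch below illustrates the idea under assumed inputs (discrete feature values stored per template); the function names and data layout are illustrative, not the paper's code.

```python
import math
from collections import Counter

def mutual_information(samples):
    """Estimate I(X; Y) in bits from a list of (x, y) pairs,
    using empirical probabilities."""
    n = len(samples)
    joint = Counter(samples)
    px = Counter(x for x, _ in samples)
    py = Counter(y for _, y in samples)
    mi = 0.0
    for (x, y), c in joint.items():
        # p(x,y) / (p(x) p(y)) simplifies to c*n / (count(x) * count(y))
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

def select_features(feature_columns, labels, k):
    """Rank feature templates (name -> list of values) by MI with the
    labels and keep the top k template names."""
    scores = {
        name: mutual_information(list(zip(values, labels)))
        for name, values in feature_columns.items()
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

A template that is independent of the labels scores 0 bits and is dropped first, which is how this style of selection prunes uninformative features before ME training.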

Problem Definition
Maximum Entropy Model
Corpus and Evaluation
System Architecture
Syntactic parsing
Predicate Tagging
Semantic Dependency labeling
Mutual Information
Conclusion
Findings

[Feature table residue; recoverable feature names: Position of Dependent, POS pair, Predicate Information, Predicate Voice, Dependent Information, Path POS]

