Abstract

Reliably annotated corpora are a valuable resource for Natural Language Processing, which justifies the search for methods capable of assisting linguistic revision. In this context, we present a study on methods for revising dependency treebanks, investigating the contribution of three different strategies to corpus review: (i) linguistic rules; (ii) an adaptation of the n-grams method proposed by Boyd et al. (2008) applied to Portuguese; and (iii) Inter-Annotator Disagreement, a linguistically motivated approach that draws inspiration from the human annotation process. The results are promising: taken together, the three methods can lead to the revision of up to 58% of the errors in a specific corpus at the cost of revising only 20% of it. We also present a tool that integrates treebank editing, evaluation, and search capabilities with the review methods, as well as a gold-standard Portuguese corpus from the oil and gas domain.
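
To make the variation n-grams idea of Boyd et al. (2008) concrete, the sketch below flags word n-grams that recur in a toy Portuguese treebank with conflicting dependency labels. This is a simplified illustration of the core intuition, not the paper's implementation: the corpus format, the `variation_nuclei` helper, and the single-token nucleus with no context expansion are all assumptions made for brevity.

```python
from collections import defaultdict

# Toy treebank: each sentence is a list of (form, head_index, deprel)
# tokens; head_index points into the sentence (-1 marks the root).
# This format is an assumption for the sketch, not the paper's format.
corpus = [
    [("o", 1, "det"), ("poço", 2, "nsubj"), ("produz", -1, "root")],
    [("o", 1, "det"), ("poço", 2, "obj"), ("produz", -1, "root")],
]

def variation_nuclei(sentences, n=1):
    """Group word n-grams by surface form and collect the dependency
    labels they receive; n-grams annotated in more than one way are
    'variation nuclei', i.e. candidate annotation errors."""
    labelings = defaultdict(set)
    for sent in sentences:
        forms = [form for form, _, _ in sent]
        for i in range(len(sent) - n + 1):
            ngram = tuple(forms[i:i + n])
            labels = tuple(deprel for _, _, deprel in sent[i:i + n])
            labelings[ngram].add(labels)
    return {ng: labs for ng, labs in labelings.items() if len(labs) > 1}

for ngram, variants in variation_nuclei(corpus).items():
    print(" ".join(ngram), "->", sorted(variants))
# prints: poço -> [('nsubj',), ('obj',)]
```

In the full method, nuclei found this way are ranked by expanding the shared surrounding context: the longer the identical context in which the annotation still varies, the more likely the variation is an error rather than genuine ambiguity.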
