Abstract

Annotation of software requirements documents is performed by experts during the requirements analysis phase to extract crucial knowledge from informally written textual requirements. Different annotation tasks target the extraction of different types of information and require experts specialized in the field. Large-scale annotation tasks require multiple experts, and the limited supply of experts can make such tasks overwhelming and very costly without proper tool support. In this paper, we present our annotation tool, LASR, which aids requirements analysis tasks by attaining more accurate annotations. Our evaluation of the tool demonstrates that the annotation data LASR collects from trained non-experts can be used to compute gold-standard annotations that strongly agree with the true gold standards set by experts, thereby eliminating the need to conduct costly adjudication sessions for large-scale annotation work.
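
The abstract does not spell out how gold-standard annotations are computed from non-expert data, but a common approach is to aggregate multiple annotators' labels per item (e.g., by majority vote) and then measure chance-corrected agreement against the expert gold standard. The sketch below illustrates that idea; the label set, the toy data, and the use of majority voting with Cohen's kappa are illustrative assumptions, not the paper's actual method.

```python
from collections import Counter

def majority_label(labels):
    """Return the most frequent label among one item's annotations."""
    return Counter(labels).most_common(1)[0][0]

def cohens_kappa(a, b):
    """Chance-corrected agreement between two label sequences."""
    assert len(a) == len(b)
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Expected agreement if both raters labelled independently
    # according to their own label marginals.
    ca, cb = Counter(a), Counter(b)
    expected = sum((ca[k] / n) * (cb[k] / n) for k in set(a) | set(b))
    return (observed - expected) / (1 - expected)

# Hypothetical toy data: three trained non-experts each label five
# requirements sentences; `expert` is the true gold standard.
non_expert = [
    ["goal", "goal", "task"],
    ["task", "task", "task"],
    ["goal", "softgoal", "goal"],
    ["task", "task", "goal"],
    ["softgoal", "softgoal", "softgoal"],
]
expert = ["goal", "task", "goal", "task", "softgoal"]

computed_gold = [majority_label(item) for item in non_expert]
print(computed_gold)                        # ['goal', 'task', 'goal', 'task', 'softgoal']
print(cohens_kappa(computed_gold, expert))  # 1.0 on this toy sample
```

On real annotation data the kappa between the aggregated non-expert labels and the expert gold standard would be below 1.0; the paper's claim is that this agreement is strong enough to make separate expert adjudication sessions unnecessary.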
