Abstract
We evaluate a method for assessing software documentation and guiding its subsequent redesign. The method is based on end-user feedback in the form of critical incident data. The objective of this technique is to use end-user evaluations to identify common problems and requirements for software documentation. In the evaluation, subjects were asked to perform a benchmark task consisting of 19 subtasks and to use the associated software documentation. Both hardcopy and online documentation were available. After completing each subtask, subjects used an online questionnaire to report critical incidents encountered while using the hardcopy and online documentation. The reported critical incidents were sorted into four categories: online documentation failure incidents, online documentation success incidents, hardcopy documentation failure incidents, and hardcopy documentation success incidents. The incidents in each failure category were reviewed to identify common documentation features or elements that caused problems. The same process was repeated for incidents categorized as successful to determine satisfactory features of the documentation. The problems were arranged in descending order from most critical to least critical by the frequency of critical incidents associated with each problem. Ties in frequency were broken by an average severity index, calculated by averaging the incident severity ratings supplied by users at the time each incident was reported. A list of documentation problems and satisfactory features was presented to the software design team to guide the redesign process.
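The ranking procedure described above (frequency descending, ties broken by average severity) can be sketched in a few lines. This is a minimal illustration, not the authors' actual analysis tooling; the incident records, problem labels, and severity values below are hypothetical.

```python
from collections import defaultdict

# Hypothetical critical-incident records: each names the documentation
# problem it was attributed to and carries the user-supplied severity
# rating reported at the time of the incident.
incidents = [
    {"problem": "index missing task terms", "severity": 4},
    {"problem": "index missing task terms", "severity": 5},
    {"problem": "online search too slow", "severity": 3},
    {"problem": "online search too slow", "severity": 4},
    {"problem": "unclear error messages", "severity": 3},
]

# Group the severity ratings by problem.
groups = defaultdict(list)
for inc in incidents:
    groups[inc["problem"]].append(inc["severity"])

# Rank problems: most frequent first; ties in frequency are broken
# by the average severity index (higher average ranks higher).
ranked = sorted(
    groups.items(),
    key=lambda kv: (-len(kv[1]), -sum(kv[1]) / len(kv[1])),
)

for problem, severities in ranked:
    avg = sum(severities) / len(severities)
    print(f"{problem}: frequency={len(severities)}, avg severity={avg:.2f}")
```

Here "index missing task terms" and "online search too slow" each occur twice, so the tie is broken by their average severities (4.5 vs. 3.5), matching the tie-breaking rule in the abstract.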
This evaluation is expected to validate the critical incident technique as a method for providing software designers with end-user data for revision of software and documentation.

Keywords: Critical Incident, Interface Problem, Digital Equipment Corporation, Online Problem, Average Severity