Abstract
The overall discovery rate, defined as the ratio of the number of unique usability problems detected by all experiment participants to the number of usability problems existing in the evaluated system, was investigated to identify significant factors in usability evaluation through a meta-analytic approach based on the n-corrected effect sizes newly defined in this study. Because many usability evaluation studies have been conducted in specific contexts and have produced mixed findings, usability practitioners need holistic and more generalized conclusions. Owing to the limited applicability of traditional meta-analysis to usability evaluation studies, a new meta-analytic approach was established and applied to 38 experiments that reported the overall discovery rate of usability problems as a criterion measure. With this approach, we successfully combined the 38 experiments and found evaluators' expertise, report type, and the interaction between usability evaluation method and time constraint to be significant factors. To increase the overall discovery rate of usability problems, we suggest that (a) free-style written reports are better than structured written reports; (b) when heuristic evaluation or cognitive walkthrough is used, the evaluation should be conducted without a time constraint, whereas when think-aloud is used, time constraint is not an important experimental condition; (c) usability practitioners do not need to be concerned about the unit of evaluation, the fidelity of the evaluated system, or the task type; and (d) HCI experts perform better than novice users or evaluators. Our conclusions can guide usability practitioners in determining evaluation contexts, and the meta-analytic approach of this study provides an alternative to traditional meta-analysis for combining the empirical results of usability evaluation.
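For clarity, the overall discovery rate described above can be written as a simple ratio; the notation here is our own illustration and is not taken from the paper. If $P_i$ denotes the set of usability problems detected by participant $i$ $(i = 1, \dots, k)$ and $N$ denotes the number of usability problems existing in the evaluated system, then

\[ D_{\text{overall}} = \frac{\bigl|\bigcup_{i=1}^{k} P_i\bigr|}{N}. \]

For example, if three participants together uncover 12 distinct problems in a system known to contain 20, the overall discovery rate is 12/20 = 0.6.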