Abstract

Usability evaluation, both by experts and by target users, is an integral part of developing and assessing digital solutions. It improves the probability of producing digital solutions that are easier, safer, more efficient, and more pleasant to use. However, despite widespread recognition of its importance, there is a lack of research and consensus on related concepts and reporting standards. The aim of this study was to generate consensus on the terms and procedures that should be considered when planning and reporting a usability evaluation of health-related digital solutions, by users and by experts, and to provide a checklist that researchers can readily use when conducting usability studies. A Delphi study with 2 rounds was conducted with a panel of international participants experienced in usability evaluation. In the first round, participants were asked to comment on definitions, rate the importance of preidentified methodological procedures on a 9-point Likert scale, and suggest additional procedures. In the second round, participants reappraised the relevance of each procedure informed by the round 1 results. Consensus on the relevance of an item was defined a priori as at least 70% of participants scoring the item 7 to 9 and less than 15% scoring it 1 to 3. A total of 30 participants (n=20 females; mean age 37.2, SD 7.7 years) from 11 different countries entered the Delphi study. Agreement was achieved on the definitions of all proposed usability evaluation-related terms (usability assessment moderator, participant, usability evaluation method, usability evaluation technique, tasks, usability evaluation environment, usability evaluator, and domain evaluator).
A total of 38 procedures related to usability evaluation planning and reporting were identified across rounds (28 related to usability evaluation involving users and 10 related to usability evaluation involving experts). Consensus on relevance was achieved for 23 (82%) of the procedures involving users and for 7 (70%) of the procedures involving experts. A checklist was proposed to guide authors when designing and reporting usability studies. This study proposes a set of terms with respective definitions, as well as a checklist to guide the planning and reporting of usability evaluation studies, constituting an important step toward a more standardized approach in the field that may enhance the quality of planning and reporting usability studies. Future studies can further validate this work by refining the definitions, assessing the practical applicability of the checklist, or assessing whether using the checklist results in higher-quality digital solutions.
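The a priori consensus rule described above (at least 70% of participants rating an item 7 to 9, and fewer than 15% rating it 1 to 3) can be sketched as a small function; this is an illustrative reading of the criterion, not code from the study itself:

```python
def consensus_reached(scores):
    """Apply the a priori Delphi consensus rule to one item.

    `scores` is a list of integer ratings on the 9-point Likert scale.
    Consensus requires >= 70% of ratings in 7-9 and < 15% in 1-3.
    """
    n = len(scores)
    high = sum(1 for s in scores if 7 <= s <= 9) / n  # proportion rating 7-9
    low = sum(1 for s in scores if 1 <= s <= 3) / n   # proportion rating 1-3
    return high >= 0.70 and low < 0.15

# Example: 8 of 10 ratings fall in 7-9 (80%) and 1 in 1-3 (10%) -> consensus
print(consensus_reached([7, 8, 9, 8, 7, 5, 2, 9, 8, 7]))  # True
```

Both thresholds must hold at once: an item with strong support but also a sizable minority of low ratings (15% or more in 1 to 3) would not reach consensus.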
