Virtual patients (VPs) are increasingly used in health care professions education to support clinical reasoning (CR) development. However, the extent to which feedback is given across CR components is unknown, and guidance is lacking on how VPs can optimize CR development. This systematic review sought to identify how VPs provide feedback on CR. Seven databases (MEDLINE, EMBASE, CINAHL, ERIC, PsycINFO, Scopus, and ProQuest Dissertations) were searched in March 2023 using terms (e.g., medical education, virtual patient, case-based learning, computer simulation) adapted from a previous systematic review. All studies that described VP use for developing CR in medical professionals and provided feedback on at least 1 CR component were retrieved. Screening, data extraction, and quality assessment were performed. Narrative synthesis was used to describe the approaches for measuring and providing feedback on CR. A total of 6,526 results were identified from the searches, of which 72 met the inclusion criteria; however, only 35 full-text articles were analyzed because the remaining 37 were abstracts that reported the interventions insufficiently. The most common CR components developed by VPs were leading diagnosis (23 [65.7%]), management or treatment plan (23 [65.7%]), and information gathering (21 [60.0%]). The CR components were explored through a variety of approaches, ranging from predefined questions to free-text responses and concept maps. Studies describing VP use for giving CR feedback have mainly focused on easy-to-assess CR components, whereas few described VPs designed to assess components such as problem representation, hypothesis generation, and diagnostic justification. Despite feedback being essential for learning, few VPs provided information on the learner's use of self-regulated learning processes. Educators designing or selecting VPs for CR development must consider the needs of their learner groups and how different CR components can be explored, and they should make the instructional design of VPs explicit in published work.