Simulation-based education is a recognized way of developing medical competencies, and there is overwhelming scientific evidence to support its efficacy. However, it is still underused, which is often related to a poor implementation process. In addition, best practices for implementing simulation-based courses grounded in implementation science are neither widely known nor widely applied. The purpose of this study was to develop a rubric, the Implementation Quality Rubric for Simulation (IQR-SIM), to evaluate the implementation quality of simulation-based courses. A 3-round, modified Delphi process involving international simulation and implementation experts was conducted to gather and converge opinions on criteria for evaluating the implementation quality of simulation-based courses. Candidate items for Round 1 were developed based on the Adapted Implementation Model for Simulation. In Round 2, items were revised and expanded to include descriptive anchors for evaluation. The criterion for inclusion was that 70% of respondents rated an item's importance as 4 or 5 out of 5. Round 3 provided refinement and final approval of items and anchors. Thirty-three experts from 9 countries participated. The initial 32-item rubric was reduced to 18 items after the 3 Delphi rounds, resulting in the IQR-SIM: an 18-item rubric with a 3-point rating scale, the nonscored options "Don't know/can't assess" and "Not applicable," and a comments section. The IQR-SIM is an operational tool for evaluating the implementation quality of simulation-based courses; it can aid the implementation process by identifying gaps, monitoring progress, and promoting the achievement of desired implementation and learning outcomes.