EmergenCSim is a novel researcher-developed serious game (SG) with an embedded scoring and feedback tool that reproduces an obstetric operating room environment. The learner must perform general anesthesia for emergent cesarean delivery for umbilical cord prolapse. The game was developed as an alternative teaching tool because of diminishing real-world exposure of anesthesiology trainees to this clinical scenario. Traditional debriefing (facilitator-guided reflection) is considered integral to experiential learning but requires the participation of an instructor. The optimal debriefing methods for SGs have not been well studied. Because electronic feedback is commonly provided at the conclusion of SGs, we aimed to compare learning when an in-person debriefing is added to electronic feedback versus electronic feedback alone. We hypothesized that an in-person debriefing in addition to the SG-embedded electronic feedback would provide superior learning compared with electronic feedback alone. Novice first-year anesthesiology residents (CA-1; n=51) (1) watched a recorded lecture on general anesthesia for emergent cesarean delivery, (2) took a 26-item multiple-choice question pretest, and (3) played EmergenCSim (maximum score of 196.5). They were randomized to either a control group that received the electronic feedback alone (group EF, n=26) or an intervention group that received the SG-embedded electronic feedback plus an in-person debriefing (group IPD+EF, n=25). All participants played the SG a second time, with instructions to try to increase their score, and then took a 26-item multiple-choice question posttest. The pre- and posttests (maximum score of 26 points each) were validated parallel forms. For groups EF and IPD+EF, respectively, mean pretest scores were 18.6 (SD 2.5) and 19.4 (SD 2.3), and mean posttest scores were 22.6 (SD 2.2) and 22.1 (SD 1.6; F1,49=1.8, P=.19). For groups EF and IPD+EF, respectively, mean first-play SG scores were 135 (SE 4.4) and 141 (SE 4.5), and mean second-play SG scores were 163.1 (SE 2.9) and 173.3 (SE 2.9; F1,49=137.7, P<.001). Adding an in-person debriefing led to greater improvement in SG scores, emphasizing the learning benefits of this practice. Improved SG performance in both groups suggests that SGs have a role as independent, less resource-intensive educational tools.