Abstract

The aim of this study was to examine the validity of a simulator test designed to evaluate focused assessment with sonography for trauma (FAST) skills. Participants included a group of ultrasound novices (n = 25) and ultrasound experts (n = 10). All participants had their FAST skills assessed using a virtual reality ultrasound simulator. Procedural performance on the 4 FAST windows was assessed by automated simulator metrics, each of which was scored as pass or fail. The validity evidence for these simulator metrics was examined using a stepwise approach according to the Standards for Educational and Psychological Testing. Metrics with validity evidence were included in a simulator test, and the reliability of test scores was determined. Finally, a pass/fail level for procedural performance was established. Of the initial 55 metrics, 34 (61.8%) had validity evidence (P < .01). A simulator test was constructed based on the 34 metrics with established validity evidence, and test scores were calculated as percentages of the maximum score. The median simulator test scores were 14.7% (range, 0%-47.1%) and 94.1% (range, 94.1%-100%) for novices and experts, respectively (P < .001). The pass/fail level was determined to be 79.7%. The performance of FAST examinations can be assessed in a simulated setting using defensible performance standards, which have both good reliability and validity.
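
For illustration only, the sketch below shows one way such a score and cutoff could be computed: the test score as the percentage of the 34 validated metrics passed, and the pass/fail level set with a contrasting-groups approach. The abstract does not state which standard-setting method was actually used, and all function names and example data here are hypothetical assumptions, not the study's procedure.

```python
import numpy as np

def test_score(metric_results):
    """Percentage of metrics passed; metric_results is an iterable of booleans."""
    results = np.asarray(metric_results, dtype=bool)
    return 100.0 * results.sum() / results.size

def contrasting_groups_cutoff(novice_scores, expert_scores):
    """Hypothetical contrasting-groups cutoff: the score (above the novice mean)
    where a fitted expert normal density first overtakes the novice density."""
    n_mu, n_sd = np.mean(novice_scores), np.std(novice_scores, ddof=1)
    e_mu, e_sd = np.mean(expert_scores), np.std(expert_scores, ddof=1)
    grid = np.linspace(0.0, 100.0, 10001)
    novice_pdf = np.exp(-0.5 * ((grid - n_mu) / n_sd) ** 2) / n_sd
    expert_pdf = np.exp(-0.5 * ((grid - e_mu) / e_sd) ** 2) / e_sd
    above = grid > n_mu
    idx = np.argmax(expert_pdf[above] >= novice_pdf[above])
    return grid[above][idx]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Illustrative score samples only; not the study's data.
    novices = rng.normal(20, 12, 25).clip(0, 100)
    experts = rng.normal(95, 3, 10).clip(0, 100)
    print(f"Example cutoff: {contrasting_groups_cutoff(novices, experts):.1f}%")
```

The cutoff in this sketch falls where an examinee's score becomes more consistent with the expert than the novice distribution; the 79.7% level reported in the study was derived from the actual participant data and should not be expected to match this illustrative output.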
