Abstract

Most medical schools test their students throughout the curriculum using in-house examinations written by the faculty who teach the courses. The authors assessed the quality of the in-house examinations used at three U.S. medical schools. In 1998, nine basic science examinations from the three schools were gathered, and each question was subjected to quality assessment by three expert biomedical test developers, each of whom had extensive experience in reviewing and evaluating questions for the United States Medical Licensing Examination (USMLE) Steps 1 and 2. Questions were rated on a five-point scale: 1 = tested recall only and was technically flawed; 5 = used a clinical or laboratory vignette, required reasoning to answer, and was free of technical flaws. Each rater made independent assessments, and the mean score for each question was calculated. Mean quality scores for questions written by National Board of Medical Examiners (NBME)-trained question writers were compared with mean scores for questions written by faculty without NBME training. The raters' quality assessments were made without knowledge of the question writers' training backgrounds or the study's hypothesis. A total of 555 questions were analyzed. The mean score for all questions was 2.39 ± 1.21. The 92 questions written by NBME-trained question writers had a mean score of 4.24 ± 0.85, and the 463 questions written by faculty without formal NBME training had a mean score of 2.03 ± 0.90 (p < .01). The in-house examinations were of relatively low quality. The quality of examination questions can be significantly improved by providing question writers with formal training.