Abstract

Valid and reliable assessment of students' knowledge and skills is integral to dental education. However, most faculty members receive no formal training in student assessment techniques. The aim of this study was to quantify the value of a professional development program designed to improve the test item-writing skills of dental faculty members. A quasi-experimental (pretest, intervention, posttest) study was conducted with faculty members in the dental school of Majmaah University, Saudi Arabia. The data assessed were 450 multiple-choice questions (MCQs) from final exams in 15 courses in 2017 (prior to the intervention; pretest) and the same number in 2018 (after the intervention; posttest). The intervention was a faculty development program implemented in 2018 to improve the writing of MCQs. This training highlighted construct-irrelevant variance (the abnormal increase or decrease in test scores due to factors extraneous to the constructs of interest) and provided expert advice to rectify flaws. Item analysis of pre- and post-intervention MCQs determined the difficulty index, discrimination index, and proportion of non-functional distractors for each question. MCQs on the 2017 and 2018 exams were compared on each of these parameters. The results showed statistically significant improvements in MCQs from 2017 to 2018 on all parameters. MCQs with low discrimination decreased, those with high discrimination increased, and the proportion of questions with more than two non-functional distractors was reduced. These results provide evidence of improved test item quality following implementation of a long-term faculty development program. Additionally, the findings underscore the need for an active dental education department and demonstrate its value for dental schools.
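The item-analysis parameters named above follow standard classical test theory definitions: the difficulty index is the proportion of examinees answering an item correctly, the discrimination index is often computed as the difference in that proportion between the top and bottom 27% of examinees by total score, and a distractor is conventionally called non-functional when fewer than 5% of examinees select it. The abstract does not give the authors' exact computational procedure, so the sketch below is a minimal illustration of these conventional formulas, with hypothetical function names and data, not the study's actual analysis code.

```python
import numpy as np

def difficulty_index(item_scores):
    """Difficulty index P: proportion of examinees answering the item correctly."""
    return item_scores.mean()

def discrimination_index(item_scores, total_scores, frac=0.27):
    """Upper-lower discrimination D = P_upper - P_lower, using the
    conventional top and bottom 27% of examinees by total exam score."""
    n = len(item_scores)
    k = max(1, int(round(frac * n)))
    order = np.argsort(total_scores)          # ascending by total score
    lower, upper = order[:k], order[-k:]
    return item_scores[upper].mean() - item_scores[lower].mean()

def nonfunctional_distractors(choices, key, options=("A", "B", "C", "D"),
                              threshold=0.05):
    """Count distractors (wrong options) chosen by fewer than 5% of examinees."""
    return sum(
        1 for opt in options
        if opt != key and np.mean(choices == opt) < threshold
    )

# Hypothetical data for one item on a 10-examinee exam:
item = np.array([1, 1, 1, 0, 0, 0, 1, 1, 0, 1])        # 1 = correct
totals = np.array([10, 9, 8, 7, 6, 5, 4, 3, 2, 1])     # total exam scores
print(difficulty_index(item))                           # 0.6
print(round(discrimination_index(item, totals), 3))
```

Under these definitions, the study's reported improvements correspond to more items falling into the acceptable difficulty range, more items with high D values, and fewer items carrying multiple distractors below the 5% selection threshold.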
