Abstract

Multiple-choice (MC) items are widely used in educational tests. Distractor analysis, an important procedure for checking the utility of response options within an MC item, can be readily implemented in the framework of item response theory (IRT). Although random guessing is a common behavior among test-takers when answering MC items, none of the existing IRT models for distractor analysis has accounted for its influence. In this article, we propose a new IRT model that distinguishes the influence of random guessing from response option functioning. A brief simulation study was conducted to examine the parameter recovery of the proposed model. To demonstrate its effectiveness, the new model was applied to the mathematics tests of the Hong Kong Diploma of Secondary Education Examination (HKDSE) from 2015 to 2019. The results of the empirical analyses suggest that the complexity of item content is a key factor in inducing students' random guessing. The implications and applications of the new model in other testing situations are also discussed.
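For context, one common way to separate random guessing from option functioning in IRT is to mix a uniform guessing process with a nominal response model for the options. The sketch below uses this familiar mixture form purely for illustration; the symbols $\pi_{ij}$, $a_{jk}$, and $c_{jk}$ are assumed notation, and the paper's actual parameterization may differ:

\[
P(Y_{ij} = k \mid \theta_i) \;=\; \pi_{ij}\,\frac{1}{m_j} \;+\; (1-\pi_{ij})\,\frac{\exp(a_{jk}\theta_i + c_{jk})}{\sum_{h=1}^{m_j}\exp(a_{jh}\theta_i + c_{jh})},
\]

where $\theta_i$ is the ability of examinee $i$, $\pi_{ij}$ is the probability that examinee $i$ guesses at random on item $j$, $m_j$ is the number of response options, and $(a_{jk}, c_{jk})$ are the slope and intercept of option $k$ under Bock's nominal response model. The first term captures uniform random guessing; the second captures how each option functions across ability levels.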
