Abstract

Asymmetric IRT models have been shown useful for capturing heterogeneity in the number of latent subprocesses underlying educational test items (Lee and Bolt, 2018a). One potentially useful practical application of such models is toward the scoring of discrete-option multiple-choice (DOMC) items. Under the DOMC format, response options are independently and randomly administered up to the (last) keyed response, and thus the scheduled number of distractor response options to which an examinee may be exposed (and consequently the overall difficulty of the item) can vary. In this paper we demonstrate the applicability of Samejima's logistic positive exponent (LPE) model to response data from an information technology certification test administered using the DOMC format, and discuss its advantages relative to a two-parameter logistic (2PL) model in addressing such effects. Application of the LPE in the context of DOMC items is shown to (1) provide reduced complexity and a superior comparative fit relative to the 2PL, and (2) yield a latent metric with reduced shrinkage at high proficiency levels. The results support the potential use of the LPE as a basis for scoring DOMC items so as to account for effects related to key location.

Highlights

  • We present a real dataset to which the models can be applied, and demonstrate the observation of item characteristic curve (ICC) asymmetry in comparing the logistic positive exponent (LPE) model to other item response theory (IRT) models that attend to the effects of the scheduled number of response options

  • Failure to appropriately attend to the asymmetry of the curves can, among other things, make it more difficult to accurately measure examinees of high ability

  • Our results support the potential benefit of an LPE model in the scoring of test performances under the discrete-option multiple-choice (DOMC) format


Summary

Asymmetric IRT and DOMC

One context in which asymmetric models may find particular value is in the scoring of discrete-option multiple-choice (DOMC) items (Foster and Miller, 2009). Under this format, respondents are scheduled to be administered response options up through the last of the keyed responses; the overall item is scored as correct only if all keyed options are endorsed (and any and all distractor options administered along the way are correctly rejected). More critical to this paper is the effect of the randomized location of the keyed responses, as this randomization implies that the same item will be administered with more scheduled distractor options (and require the correct rejection of more distractors) for some examinees than for others. For reasons we explain later in the paper, we argue that the LPE has greater psychological plausibility in such contexts given the conjunctive interaction that occurs between the individual option responses in determining an overall correct response to the item (i.e., an item is scored correct only if all presented distractors are rejected and all keyed options are endorsed). We demonstrate the implications of the asymmetry on the metric properties of the IRT analysis and offer thoughts for future research.
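The conjunctive logic described above can be made concrete with a short sketch. Samejima's LPE raises a 2PL logistic curve to an acceleration exponent, producing an asymmetric ICC; the DOMC item-level probability below treats option-level responses as conditionally independent given proficiency, which is an illustrative assumption on our part, as are all parameter values (none are taken from the paper's data).

```python
import math

def logistic(theta, a, b):
    """Standard 2PL item response function."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def lpe(theta, a, b, xi):
    """Samejima's logistic positive exponent (LPE) ICC: the 2PL
    curve raised to the acceleration parameter xi.  xi > 1 shifts
    difficulty toward the upper end of the curve (asymmetry);
    xi = 1 reduces to the ordinary 2PL."""
    return logistic(theta, a, b) ** xi

def domc_correct_prob(theta, distractor_params, key_params):
    """Conjunctive DOMC scoring: the item is correct only if every
    administered distractor is rejected AND every keyed option is
    endorsed.  Assuming conditional independence of option-level
    responses given theta (a simplification for illustration), the
    item-level probability is a product of option-level terms."""
    p = 1.0
    for a, b in distractor_params:   # P(correctly reject distractor)
        p *= logistic(theta, a, b)
    for a, b in key_params:          # P(correctly endorse keyed option)
        p *= logistic(theta, a, b)
    return p
```

Because each additional scheduled distractor contributes another factor below 1 to the product, an examinee who happens to draw a later key location faces a harder version of the same item, which is the key-location effect the LPE's exponent is well suited to absorb.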

ASYMMETRIC IRT MODELS
MODELS FOR DOMC ITEM RESPONSE
MODEL ESTIMATION AND COMPARISON
ILLUSTRATION OF EXAMPLE ITEMS
IMPLICATIONS FOR IRT METRIC
Triple-keyed items
CONCLUSION AND DISCUSSION
AUTHOR CONTRIBUTIONS

