Abstract

This study explored the impact of the homogeneity of answer choices on item difficulty and discrimination. Twenty-two matched pairs of elementary and secondary mathematics items were administered to randomly equivalent samples of students. Each item-pair comparison was treated as a separate study, and the resulting set of effect sizes was analyzed using meta-analysis and a moderator analysis. The results show that multiple-choice (MC) items with homogeneous answer choices tend to be easier than MC items with nonhomogeneous answer choices, but the magnitude of the effect was related to item content (algebra vs. geometry) and to the answer choice construction strategy. For algebra items, those with homogeneous answer choices were easier than those with nonhomogeneous answer choices; for geometry items, however, the difficulty of the two versions was not statistically different. Taking construction strategy into account, items with homogeneous answer choices were easier than items with nonhomogeneous answer choices when different strategies were used to write the two versions, whereas when the same strategy was used, the difficulty of the two item types was not statistically different. In addition, item discrimination did not differ significantly between MC items with homogeneous and nonhomogeneous answer choices.
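The pooling step described above can be made concrete. Below is a minimal sketch, in Python, of a DerSimonian-Laird random-effects meta-analysis with a simple subgroup (moderator) breakdown. It illustrates the general technique named in the abstract, not the authors' actual analysis; the function name, effect sizes, variances, and the algebra/geometry split are synthetic placeholders.

import numpy as np

def random_effects_pool(d, v):
    # DerSimonian-Laird pooling: d = per-pair effect sizes, v = their sampling variances.
    d, v = np.asarray(d, float), np.asarray(v, float)
    w = 1.0 / v                                 # inverse-variance (fixed-effect) weights
    d_fe = np.sum(w * d) / np.sum(w)            # fixed-effect pooled estimate
    q = np.sum(w * (d - d_fe) ** 2)             # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(d) - 1)) / c)     # between-pair variance estimate
    w_re = 1.0 / (v + tau2)                     # random-effects weights
    d_re = np.sum(w_re * d) / np.sum(w_re)      # random-effects pooled estimate
    se = np.sqrt(1.0 / np.sum(w_re))
    return d_re, se, tau2

# Synthetic effect sizes standing in for 22 item-pair comparisons,
# split by the content moderator (algebra vs. geometry).
rng = np.random.default_rng(0)
pairs = {"algebra": (rng.normal(0.3, 0.1, 12), np.full(12, 0.02)),
         "geometry": (rng.normal(0.05, 0.1, 10), np.full(10, 0.02))}
for content, (d, v) in pairs.items():
    est, se, tau2 = random_effects_pool(d, v)
    print(f"{content}: pooled d = {est:.3f} (SE = {se:.3f}, tau^2 = {tau2:.4f})")

A moderator analysis then asks whether the pooled estimates differ between subgroups, for example by comparing the difference in pooled effects against its combined standard error.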

Highlights

  • If construct-irrelevant artifacts of the test-development process interfere with the validity of inferences made from test scores, the entire testing enterprise is at risk

  • This study examines the impact of homogeneity of answer choices only for mathematics MC items

  • Unlike previous studies, which explored the impact of answer choice homogeneity on item difficulty for items with word-based answer choices, the current study examines its impact on both the difficulty and the discrimination of mathematics items with numerical answer choices


Introduction

If construct-irrelevant artifacts of the test-development process interfere with the validity of inferences made from test scores, the entire testing enterprise is at risk. Relatively few empirical studies have examined the impact of item-writing guidelines on test performance. Previous studies have suggested that forming answer choices is a difficult part of the item-construction process (Haladyna & Downing, 1989; Hansen & Dexter, 1997), because each individual answer choice can potentially influence item quality. Haladyna and colleagues (2002) proposed 31 item-writing guidelines for creating high-quality MC items that contribute to overall test reliability and validity. The present study examines three commonly cited guidelines related to constructing answer choices: No. 23, “Keep choices homogeneous”; No. 29, “Make all distractors plausible”; and No. 30, “Use typical errors of students.”

