Abstract

Item analysis (IA) is used to investigate the pass/fail dichotomy, mastery or non‐mastery of instructional material or a criterion, item difficulty, and test takers' ability. This entry delineates the conceptual underpinnings of classical and modern test theory approaches to IA and offers practical tips, with an emphasis on norm‐referenced testing (item facility [IF], item discrimination [ID], and distractor analysis), criterion‐referenced testing (difference index [DI] and B‐index), and the Multi‐Faceted Rasch Model (MFRM), which builds on the one‐parameter model of item response theory (IRT). The entry draws on data from reading, grammar, and listening tests to examine the basic premise of IA: deciding whether to keep, modify, or weed out test items.
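As a rough illustration of the classical statistics the entry names, the sketch below computes item facility (proportion correct), an upper‐minus‐lower item discrimination index, and the B‐index, using standard classical‐test‐theory definitions. The score matrix, the 27% upper/lower split, and all function names are illustrative assumptions, not the entry's own data or procedure.

```python
# Illustrative sketch of classical item-analysis statistics (IF, ID, B-index).
# Data and the 27% split are invented for demonstration; formulas follow
# common classical-test-theory definitions, not this entry's exact procedure.

def item_facility(responses):
    """IF: proportion of test takers answering the item correctly (0/1 scores)."""
    return sum(responses) / len(responses)

def item_discrimination(matrix, item, fraction=0.27):
    """ID: IF among the highest total scorers minus IF among the lowest,
    using a conventional 27% split of the sample."""
    totals = [sum(row) for row in matrix]
    order = sorted(range(len(matrix)), key=lambda i: totals[i])
    n = max(1, round(fraction * len(matrix)))
    low = [matrix[i][item] for i in order[:n]]    # bottom scorers
    high = [matrix[i][item] for i in order[-n:]]  # top scorers
    return item_facility(high) - item_facility(low)

def b_index(responses, passed):
    """B-index: IF among test takers who passed the test minus IF among
    those who failed (criterion-referenced discrimination)."""
    p = [r for r, ok in zip(responses, passed) if ok]
    f = [r for r, ok in zip(responses, passed) if not ok]
    return item_facility(p) - item_facility(f)

# Invented 6-person, 4-item score matrix (1 = correct, 0 = incorrect).
scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]
print([round(item_facility([row[j] for row in scores]), 2) for j in range(4)])
print(round(item_discrimination(scores, 0), 2))
```

Items with very high or very low IF, or with ID near zero or negative, are the usual candidates for modification or removal.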
