Abstract

Richer diagnostic information about examinees’ cognitive strengths and weaknesses is obtained from cognitively diagnostic assessments (CDAs) when a proper cognitive diagnosis model (CDM) is used for response data analysis. Researchers emphasize that this requires a preset cognitive model specifying the underlying hypotheses about the structure of the response data. However, many real-data CDM applications are add-ons to simulation studies and fit CDMs to data obtained from non-CDAs. Such a procedure is referred to as retrofitting, and fitting CDMs to traditional test data is not uncommon. To address a major validity concern in CDAs, namely item/test bias, several DIF detection techniques compatible with various CDMs have recently been proposed. This study employs several DIF detection techniques developed within the CTT, IRT, and CDM frameworks and compares their results to understand the extent to which the DIF flagging behavior of items is affected by retrofitting. A secondary purpose of this study is to gather evidence about test booklet effects (i.e., item ordering) on items’ psychometric properties through DIF analyses. Results indicated severe differences in DIF flagging prevalence across DIF detection techniques employing the Wald test, Raju’s area measures, and Mantel-Haenszel statistics. The largest number of DIF cases was observed when the data were retrofitted to a CDM. The results further revealed that an item might be flagged as DIF in one booklet but not in another.
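For reference, the Mantel-Haenszel procedure named above conditions on total score and compares the odds of a correct response between the reference and focal groups. A standard formulation (included here as a general reminder, not reproduced from this article) is

$$\hat{\alpha}_{MH} = \frac{\sum_k A_k D_k / T_k}{\sum_k B_k C_k / T_k}, \qquad \Delta_{MH} = -2.35 \,\ln \hat{\alpha}_{MH},$$

where, at matched score level $k$, $A_k$ and $B_k$ are the numbers of reference-group examinees answering the item correctly and incorrectly, $C_k$ and $D_k$ are the corresponding focal-group counts, and $T_k$ is the total number of examinees at that level; $\Delta_{MH}$ places the common odds ratio on the ETS delta scale.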

Highlights

  • In educational practice, many large-scale tests focus on summative assessment, and their formative features are limited

  • Examinee responses obtained from such assessment procedures may be analyzed via statistical models known as cognitive diagnosis models (CDMs)

  • This study focused especially on the variation in differential item functioning (DIF) analysis results when the data were retrofitted to a CDM such as the DINA model

Introduction

Many large-scale tests focus on summative assessment, and their formative features are limited. Tests developed to diagnose examinees’ strengths and weaknesses may provide rich information toward formative assessment and are referred to as cognitively diagnostic assessments (de la Torre & Minchen, 2014). Examinee responses obtained from such assessment procedures may be analyzed via statistical models known as cognitive diagnosis models (CDMs). Such diagnostic information may be considered valuable feedback for students, teachers, and educational programs. Rather than providing just a coarse indicator of how examinees think about and complete educational tasks, CDMs enable practitioners to identify and report the finer-grained attributes examinees use to complete such tasks.
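To make the modeling idea concrete, consider the DINA model mentioned in the highlights. In its standard formulation (a general illustration, not taken from this article), the probability that examinee $i$ answers item $j$ correctly is

$$P(X_{ij}=1 \mid \boldsymbol{\alpha}_i) = (1-s_j)^{\eta_{ij}}\, g_j^{\,1-\eta_{ij}}, \qquad \eta_{ij} = \prod_{k=1}^{K} \alpha_{ik}^{\,q_{jk}},$$

where $\alpha_{ik}$ indicates whether examinee $i$ has mastered attribute $k$, $q_{jk}$ is the Q-matrix entry specifying whether item $j$ requires attribute $k$, and $s_j$ and $g_j$ are the item’s slip and guessing parameters. The latent attribute pattern $\boldsymbol{\alpha}_i$, rather than a single continuous ability, is what supports the fine-grained diagnostic reporting described above.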
