Abstract

Cognitive diagnostic assessment (CDA), rooted in cognitive and educational psychology, is designed to diagnose test takers' underlying abilities in language comprehension skills such as reading comprehension. Applying CDA allows a test to be scrutinized carefully so that biased items, which can have a serious impact on individuals, educational systems, and societies, are removed. In this study, psychometric statistical analyses were applied: Differential Attribute Functioning (DAF) was used to detect the probability of attribute mastery among test takers, and Differential Item Functioning (DIF) was estimated to compare item performance across candidates grouped by gender and by their GPAs in the BA and MA degrees. The randomly selected participants were 7,420 female and male candidates sitting for the nationwide PhD admission test to pursue their education in Applied Linguistics. A Q-matrix was developed, the data were fed into RStudio, and the Generalized Deterministic Inputs, Noisy "and" Gate (GDINA) model was run. The results flagged large DIF in the gender group in 2019, and in the gender, BA, and MA groups in 2020. In sum, this study is an attempt to raise test developers' awareness of the critical discursive sources of inequity and bias. Its implications can provide pedagogically useful diagnostic information for test designers and teachers, since a high-stakes proficiency test needs to be valid, reliable, and fair if it is to lead to positive change.
