Abstract

This study reports the psychometric evaluation of an item bank for an Assessment for Learning (AfL) tool designed to assess primary school students’ reading comprehension skills. A pool of 46 Primary 1 to 6 reading passages and their accompanying 522 multiple-choice and short-answer items was developed based on the Progress in International Reading Literacy Study (PIRLS) assessment framework. The items were field-tested in 27 schools in Singapore, involving 9834 students aged between 7 and 13. Four main comprehension processes outlined in PIRLS were assessed: focusing on and retrieving explicitly stated information, making straightforward inferences, interpreting and integrating ideas and information, and evaluating and critiquing content and textual elements. Rasch analysis was employed to examine students’ item response patterns for (1) model and item fit; (2) differential item functioning (DIF) with respect to gender and test platform; (3) local item dependence (LID) within and across reading passages; and (4) distractor functioning of options within the multiple-choice items. Results showed that the data adequately fit the unidimensional Rasch model across all test levels, with good internal consistency. Psychometric issues identified were primarily related to ill-functioning distractors and local dependence among items. Problematic items were reviewed and subsequently amended by a panel of assessment professionals for future recalibration. This psychometrically and theoretically sound item bank is envisaged to be valuable for developing comprehensive classroom AfL tools that inform English reading comprehension instructional design in the Singaporean context.
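
For context, the unidimensional Rasch model referenced above expresses the probability of a correct response as a function of person ability and item difficulty. The formulation below is a general sketch in standard notation (person ability \theta_n, item difficulty \delta_i); the abstract does not state the exact parameterisation used (e.g. dichotomous versus partial credit scoring for the short-answer items), so this should be read as illustrative rather than the study’s specification.

% Dichotomous Rasch model: probability that person n answers item i correctly,
% given person ability \theta_n and item difficulty \delta_i.
P(X_{ni} = 1 \mid \theta_n, \delta_i) = \frac{\exp(\theta_n - \delta_i)}{1 + \exp(\theta_n - \delta_i)}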
