Abstract

The Programme for International Student Assessment (PISA) provides valuable information to participating countries by allowing them to compare the performance of their students with that of students in other countries. When making such comparisons, the construct being measured should be the same across groups at both the item and test levels. The purpose of this study was to examine students' responses to 16 dichotomously scored science items in PISA 2015 for item bias using statistical procedures at the item level. Student data from three Latin American countries (Chile, Costa Rica and Mexico) were used to conduct differential item functioning (DIF) analyses across cultural groups. About 25% of the items in each country pair were detected as DIF items with large effect sizes. Based on these findings, it can be concluded that the possible effect of item bias on the validity of the interpretation and use of PISA 2015 science test results should be considered when comparing test results across the cultures investigated in this study.
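
The abstract does not specify which DIF procedure was applied, but the sketch below illustrates one common item-level approach for dichotomously scored items: the Mantel-Haenszel method with the ETS A/B/C effect-size classification. Everything in it is an assumption for illustration only: the data are simulated, the two "country" groups, the one-logit difficulty shift, and the helper function are hypothetical, and it does not reproduce the study's analysis or the PISA 2015 data.

import numpy as np

rng = np.random.default_rng(0)

def mantel_haenszel_d_dif(item, total, group):
    # MH D-DIF (ETS delta metric) for one dichotomous item.
    # item: 0/1 responses; total: matching score; group: 0 = reference, 1 = focal.
    num = den = 0.0
    for k in np.unique(total):
        stratum = total == k
        ref = stratum & (group == 0)
        foc = stratum & (group == 1)
        a = np.sum(item[ref] == 1)  # reference correct
        b = np.sum(item[ref] == 0)  # reference incorrect
        c = np.sum(item[foc] == 1)  # focal correct
        d = np.sum(item[foc] == 0)  # focal incorrect
        n = a + b + c + d
        if n == 0 or (a + b) == 0 or (c + d) == 0:
            continue  # drop strata where one group is empty
        num += a * d / n
        den += b * c / n
    alpha_mh = num / den if den > 0 else np.nan
    return -2.35 * np.log(alpha_mh)

# Simulated data: 2,000 examinees, 16 dichotomous items, two hypothetical country groups.
n_persons, n_items = 2000, 16
group = rng.integers(0, 2, n_persons)
ability = rng.normal(0.0, 1.0, n_persons)
difficulty = rng.normal(0.0, 1.0, n_items)

# Item 1 is made one logit harder for the focal group to create artificial DIF.
logits = ability[:, None] - difficulty[None, :]
logits[:, 0] -= 1.0 * group
responses = (rng.random((n_persons, n_items)) < 1.0 / (1.0 + np.exp(-logits))).astype(int)
total = responses.sum(axis=1)

# Flag items using the common ETS effect-size bands (significance tests omitted here).
for i in range(n_items):
    d_dif = mantel_haenszel_d_dif(responses[:, i], total, group)
    label = "C (large)" if abs(d_dif) >= 1.5 else "B (moderate)" if abs(d_dif) >= 1.0 else "A (negligible)"
    print(f"item {i + 1:2d}: MH D-DIF = {d_dif:6.2f}  {label}")

In this simulation, the item given the extra logit of difficulty for the focal group should usually fall in the C band, loosely mirroring the "large effect size" flagging reported in the abstract, while the remaining items should mostly fall in the A band.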
