Abstract

International education assessments are important for educational improvement when they are used appropriately by participating countries to identify weaknesses in their educational systems. However, their usefulness depends on the validity of the test itself. One aspect of validity worth investigating is fairness, which concerns whether assessment items and tests are fair to all subgroups of examinees. Differential item functioning (DIF) and differential test functioning (DTF) methods are commonly used to assess fairness: items that are not fair are flagged as exhibiting DIF, and tests that are not fair are flagged as exhibiting DTF. The objective of this research was to apply DIF and DTF methods to analyze the extent of differential item and test functioning in science assessment data. The data explored were secondary data from the 2009 Programme for International Student Assessment (PISA), analyzed using the Differential Item Functioning Analysis System (DIFAS) version 5.0. It was found that the mixed-format tests used by PISA favored some groups of examinees over others, indicating a degree of unfairness across groups of students with different backgrounds. It is recommended that DIF items be removed before scores are calculated and reported, and that a value-added model be used to remove factors that are unfair to different groups of test takers.
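To illustrate the kind of DIF screening the abstract describes, the following is a minimal sketch of the Mantel-Haenszel procedure, a standard DIF method (and one of the statistics reported by DIFAS). The counts below are hypothetical, not PISA data: examinees are stratified by matched ability (e.g., total score), and each stratum contributes a 2x2 table of reference/focal group by correct/incorrect.

```python
import math

def mantel_haenszel_odds_ratio(tables):
    """Compute the Mantel-Haenszel common odds ratio across strata.

    tables: list of (A, B, C, D) tuples, one per matched-ability stratum:
      A = reference group correct,  B = reference group incorrect,
      C = focal group correct,      D = focal group incorrect.
    An odds ratio above 1 suggests the item favors the reference group.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

def ets_delta(alpha_mh):
    """Transform the MH odds ratio to the ETS delta scale.

    MH D-DIF = -2.35 * ln(alpha_MH); under the common ETS convention,
    |MH D-DIF| >= 1.5 (with significance) corresponds to large ("C") DIF.
    """
    return -2.35 * math.log(alpha_mh)

# Hypothetical strata for one item (not real PISA counts):
tables = [
    (40, 10, 30, 20),  # low-ability stratum
    (35, 15, 25, 25),  # middle-ability stratum
    (30, 20, 20, 30),  # high-ability stratum
]
alpha = mantel_haenszel_odds_ratio(tables)
delta = ets_delta(alpha)
print(f"alpha_MH = {alpha:.3f}, MH D-DIF = {delta:.3f}")
```

In this made-up example the odds ratio exceeds 1 at every matched ability level, so the item would be flagged as favoring the reference group; in practice such a flag would prompt review or removal of the item before score reporting, as the abstract recommends.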
