Abstract

Large-scale international assessment studies such as the Trends in International Mathematics and Science Study (TIMSS) or the Programme for International Student Assessment (PISA) give researchers and policy makers the opportunity to conduct secondary analyses that answer questions about educational outcomes and compare the impact of particular inputs on student outcomes across countries. These comparisons rest on the assumption that questionnaire items translated into different languages are understood in the same way by all participants. Presenting a case from Turkey, this paper shows that equivalency of questionnaire items is not always achieved. The case concerns demographic information about teacher preparation, and the sample is drawn from eighth-grade science and mathematics teachers who participated in TIMSS 2007, 2011, and 2015 in Turkey. Descriptive analysis of the data collected from these teachers, together with comparisons across subjects and years, shows that teachers may have misunderstood a question about their major, limiting the claims that can be made about teacher preparation in Turkey. Researchers and policy analysts who use secondary data collected by international assessment studies should be aware of such comparability issues in adapted items before conducting any secondary analyses.

Highlights

  • The rich collection of information in these studies enables policy-makers and researchers to conduct secondary analyses to understand how, and to what extent, various factors are related to important student outcomes such as scientific literacy or self-efficacy in mathematics, and how these associations vary across countries and over time (Anderson et al 2007; Ferrini-Mundy and Schmidt 2005; Fertig 2003)

  • Drawing on Bray et al's (2020) recent study, which showed that major adaptation differences across versions of items about students' outside-of-school educational activities in the Programme for International Student Assessment (PISA) led to variations in how these items were interpreted across countries, this paper presents a case from the Trends in International Mathematics and Science Study (TIMSS) in which differences in meaning between the Turkish and English versions of a teacher questionnaire item produced inconsistent responses from participating science and mathematics teachers in Turkey

  • Presenting a case from students' out-of-school activities questionnaire items in PISA, Bray et al (2020) show that such comparability problems can arise even when measuring objective attributes

Introduction

Large-scale international assessment studies such as the Trends in International Mathematics and Science Study (TIMSS) or the Programme for International Student Assessment (PISA) provide critical evidence for charting educational progress and for shaping countries' educational policies (Ababneh et al 2016; Klemencic 2010; Lockheed and Wagemaker 2013; Paine and Zeichner 2012; Sjoberg 2015; Tobin et al 2015). Drawing on Bray et al's (2020) recent study, which showed that major adaptation differences across versions of items about students' outside-of-school educational activities in PISA led to variations in how these items were interpreted across countries, this paper presents a case from TIMSS in which differences in meaning between the Turkish and English versions of a teacher questionnaire item produced inconsistent responses from participating science and mathematics teachers in Turkey.
