Abstract

Achievement gaps between students with low and high language proficiency appear for word problems, but is this due to their text format or to their conceptual challenges? A test with percent problems of different types, posed in pure, text, and visual formats, was conducted with N = 308 seventh graders. Students' scores were analyzed using a cognitive diagnosis model. Contrary to expectations, the probability that students with low language proficiency solve items in text format is not lower than for items in pure format. These results are interpreted as an indication that conceptual challenges might have a stronger impact than reading challenges.
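
As a rough orientation for readers unfamiliar with cognitive diagnosis models: the section outline below indicates that the DINA (deterministic-input, noisy-AND) model was used. The following Python sketch shows the standard DINA item response function purely as an illustration; the skill profile, Q-matrix row, and slip/guess values are hypothetical and are not parameters estimated in the study.

```python
import numpy as np

def dina_solve_probability(alpha, q_row, slip, guess):
    """Probability of a correct response for one student-item pair under DINA.

    alpha : 0/1 vector of skills the student has mastered
    q_row : 0/1 row of the Q-matrix listing the skills the item requires
    slip  : probability of failing despite mastering all required skills
    guess : probability of succeeding despite missing a required skill
    """
    eta = int(np.all(alpha >= q_row))  # 1 only if every required skill is mastered
    return (1 - slip) ** eta * guess ** (1 - eta)

# Hypothetical example: an item requiring two skills, a student who masters only one
q_row = np.array([1, 1, 0])   # item requires skills 1 and 2
alpha = np.array([1, 0, 1])   # student masters skills 1 and 3
print(dina_solve_probability(alpha, q_row, slip=0.1, guess=0.2))  # 0.2 (guessing level)
```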

Highlights

  • Large-scale assessment studies have repeatedly documented achievement gaps for language minority students (Martiniello, 2008; Abedi, 2006; Haag et al., 2013) or for socially disadvantaged students with low language proficiency, even if they speak the majority language (Prediger et al., 2013; Walzebug, 2014)

  • Items that posed high difficulties for students with low language proficiency (LP) were characterized not by reading challenges, but rather by conceptual or process-oriented challenges, as confirmed in interview studies. These findings suggest investigating the role of the text format in comparison to other formats that pose the same conceptual challenges

  • The overall average performance of students on the Percent-Cross-Test clearly indicates that students with both high and low language proficiency lack the conceptual understanding needed to solve items involving percentages, though to differing degrees



Introduction

Large-scale assessment studies have repeatedly documented achievement gaps for language minority students (Martiniello, 2008; Abedi, 2006; Haag et al., 2013) or for socially disadvantaged students with low language proficiency, even if they speak the majority language (Prediger et al., 2013; Walzebug, 2014). Although it seems plausible to trace these language gaps back to word problems and their language demands (Duarte et al., 2011), little is known about whether it is really the text format of an item that disadvantages language learners, or its inherent conceptual demands. As most studies disentangling obstacles for language learners investigate complete assessments containing purely mathematical procedural items as well as context items, there is a risk of confounding language demands and conceptual demands (e.g., Martiniello, 2008; Wolf & Leon, 2009). That is why the study presented here constructed a test with items of comparable conceptual demands, but in different formats. If test items with similar conceptual demands are posed in pure format, text format, or visual format, do students with low language proficiency really have more difficulties with the text format?
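
To make the notion of comparable conceptual demands concrete for percentages (the numbers below are an illustrative example, not an item from the test): every percent problem rests on the relation between part, percentage, and base, and the basic problem types ask for one of these three quantities, while the same task can be posed in pure, text, or visual format.

$$\text{part} = \frac{p}{100}\cdot\text{base}, \qquad \text{e.g. } 25\% \text{ of } 80 = \frac{25}{100}\cdot 80 = 20$$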

Language gaps and word problems
Students’ performance with regard to different problem formats
Conceptual demands posed by different problem types – The case of percentages
Research Questions
Construction of the main instrument
Other measures
Sampling and subsampling
Data Analysis
Background of Cognitive Diagnosis Models
The CDM DINA Model
Q-matrix of present model and derived parameters
Model fit and item related parameters
Students’ performances in different problem formats and types
Differences between the subsamples with high and low language proficiency
Discussion
Notes on contributors
