Abstract

Educators need actionable information about student progress during the school year. This paper explores an approach to this problem in the writing domain that combines three measurement approaches intended for use in interim-assessment fashion: scenario-based assessments (SBAs), to simulate authentic classroom tasks; automated writing evaluation (AWE) features, to track changes in performance; and writing process traits derived from a keystroke log. Our primary goal is to determine whether SBAs designed to measure English Language Arts skills, supplemented by richer measurement of the writing task, function well as interim assessments that are sensitive to differences in performance related to differences in quality of instruction. We calibrated these measures psychometrically using data from a prior study and then applied them to evaluate changes in performance in one suburban and two urban middle schools that taught argument writing. Of the three schools, only School A (the suburban school, with the strongest overall performance) showed significant score increases on an essay task, accompanied by distinctive patterns of improvement. A general, unconditioned growth pattern was also evident. These results demonstrate an approach that can provide richer, more actionable information about student status and changes in student performance over the course of the school year.
