Abstract

In this paper, we report on a study to quantify the impact of a brief assessment literacy intervention on student learning and on student assessment literacy. We first define ‘assessment literacy’ and then report on the development and validation of an assessment literacy measurement instrument. Using a quasi-experimental design, we quantified the impact of an assessment literacy-building intervention on students’ assessment literacy levels and on their subsequent performance on an assessment task. The intervention involved students in the experimental condition analysing, discussing and applying an assessment rubric to actual examples of student work that exemplified extremes of standards of performance on the task (e.g. poor, excellent). Results showed that such a procedure could be expected to have a positive impact on assessment literacy levels and on student performance (on a similar or related task). Regression analyses indicated that the greatest predictor of enhanced student marks (on the assessment task that was the subject of the experiment) was the development of students’ ability to judge standards of performance on work created in response to a similar task. The intervention took just 50 minutes, indicating a good educational return on the pedagogical investment.
