Abstract

Conceptualising peer assessment as a dialogue is expected to bring many benefits for learners. However, open-ended dialogue still faces many challenges to learning effectiveness, because superficial feedback content, resistance to critical thinking and distrust of peers’ assessments may negatively affect learners’ performance in their interactions. This study created regulation scripts for dialogic peer assessment, which not only engaged learners in meaningful dialogues through prompt questions, but also guided them to establish shared goals and monitor their collaboration processes. To verify the scripts’ effectiveness, a quasi-experimental study was conducted with 34 undergraduate students in an authentic learning environment. Learners in the experimental groups exchanged feedback in stages involving case evaluation, group dialogic feedback and individual reflection with the support of regulation scripts, while learners in the control groups completed the same process with general scripts. The quantitative data showed that the experimental groups achieved greater gains in climate of trust and critical thinking. In terms of feedback quality, however, there was no significant difference between the two groups.
