Abstract

In recent years, peer assessment has been used increasingly in classrooms and other learning settings. Although the prevailing view is that peer assessment has a positive effect on learning, the results reported across empirical studies are mixed. In this meta-analysis, we synthesised findings based on 134 effect sizes from 58 studies. Compared to students who do not participate in peer assessment, those who do show a 0.291 standard deviation increase in performance. Further, we performed a meta-regression analysis to examine the factors that are likely to influence the peer assessment effect. The most critical factor is rater training: when students receive rater training, the effect size of peer assessment is substantially larger than when they do not. Computer-mediated peer assessment is also associated with greater learning gains than paper-based peer assessment. A few other variables (such as rating format, rating criteria and frequency of peer assessment) also show noticeable, although not statistically significant, effects. The results of the meta-analysis offer researchers and teachers a basis for deciding how to make effective use of peer assessment as a learning tool.
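The abstract describes a random-effects synthesis of study-level effect sizes followed by a meta-regression on moderators such as rater training; the paper's exact model specification is not given here. The sketch below illustrates, under those assumptions, how such an analysis is commonly set up: DerSimonian-Laird random-effects pooling plus an inverse-variance-weighted regression on a single moderator. All data values, variable names, and the simplified weighting scheme are hypothetical and purely illustrative, not taken from the study.

```python
import numpy as np

# Hypothetical per-study data: standardised mean differences (e.g., Hedges' g),
# their sampling variances, and a moderator (1 = raters trained, 0 = not).
g = np.array([0.45, 0.10, 0.62, 0.25, 0.38, 0.05])
v = np.array([0.020, 0.015, 0.030, 0.025, 0.018, 0.022])
trained = np.array([1, 0, 1, 0, 1, 0])

# --- Random-effects pooling (DerSimonian-Laird estimate of tau^2) ---
w_fixed = 1.0 / v
g_fixed = np.sum(w_fixed * g) / np.sum(w_fixed)
Q = np.sum(w_fixed * (g - g_fixed) ** 2)          # heterogeneity statistic
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - (len(g) - 1)) / c)           # between-study variance

w_re = 1.0 / (v + tau2)                           # random-effects weights
g_re = np.sum(w_re * g) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print(f"Pooled effect: {g_re:.3f} (SE {se_re:.3f})")

# --- Weighted meta-regression on the rater-training moderator ---
# (Simplified: reuses the random-effects weights rather than re-estimating
# residual heterogeneity within the regression model.)
X = np.column_stack([np.ones_like(g), trained])   # intercept + moderator
W = np.diag(w_re)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ g)
print(f"Effect without training: {beta[0]:.3f}; "
      f"additional gain with training: {beta[1]:.3f}")
```

With real data, the moderator matrix would include the other coded study features mentioned in the abstract (assessment medium, rating format, rating criteria, frequency), and a dedicated package would typically be used to obtain standard errors and heterogeneity diagnostics.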
