Highlights

  • A decade ago, a larger meta-analysis revealed that peer marks tend to agree well with teacher marks, in particular if a global judgment was made and if it was based on well-understood assessment criteria [3]

  • For example, it has been reported that even when a scoring rubric is co-created with students, teachers cannot expect students to know how to apply it independently

  • For example, it has been observed of scoring rubrics used in peer assessment [2] that students are not always good at peer- and self-assessment at first, even with a rubric in hand


Summary

Introduction

A decade ago, a larger meta-analysis revealed that peer marks tend to agree well with teacher marks, in particular when a global judgment was made and was based on well-understood assessment criteria [3]. Others have pointed out that a discrepancy between student and teacher understanding of the assessment criteria gives rise to inconsistencies when their marking is compared, and that a lack of sufficiently broad background knowledge (expertise) on the topic being assessed, for example not having read the same references as the peers under assessment, can be a perceived challenge for peer markers [4].

