Abstract

This paper reports on a study involving the design of online peer assessment (PA) activities to support university students’ small-group project-based learning in an introductory course. The study aimed to investigate the influence of different types of PA, namely the rubric (quantitative ratings), peer feedback (qualitative comments), and hybrid (a combination of the rubric and peer feedback), on students’ project performance, and to further explore students’ perspectives on online PA. The quantitative findings suggested that (a) students in the hybrid condition tended to have better project performance than those in the peer feedback condition, and (b) students in the rubric condition performed as well as those in both the hybrid and peer feedback conditions. The qualitative findings suggested that, besides the type of assessment, other possible confounding variables that might affect performance included perceived learning benefits, professional assessment, acceptance, and the online PA system.

Highlights

  • Due to the popularity of online learning, online assessment has received much attention, particularly the evaluation of open-response assignments and a wide variety of produced work, including writing, portfolios, presentations, reports, and artifacts

  • This paper reports on a study involving the design of online peer assessment (PA) activities to support university students’ small-group project-based learning in an introductory course

  • The findings implied that the hybrid approach was more effective for learners with little experience of PA and a basic level of domain knowledge


Summary

Introduction

Due to the popularity of online learning, online assessment has received much attention, particularly the evaluation of open-response assignments and a wide variety of produced work, including writing, portfolios, presentations, reports, and artifacts. These take more time and effort to evaluate than multiple-choice questions do. The primary system function is the automation of administrative logistics, which helps to resolve this problematic aspect of assessment work (for a review, see Rosa, Coutinho, & Flores, 2016). Examples of the administrative logistics include the submission of grading and feedback, the anonymizing and random distribution of assessed tasks, making feedback available, and the calculation of marks (Mostert & Snowball, 2013).
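
To make this concrete, the following is a minimal sketch, not drawn from the paper, of how such administrative logistics might be automated: anonymized, random distribution of submitted work to peer assessors (excluding self-assessment) and calculation of each submission’s mark from the rubric ratings it receives. All function names and data structures here are hypothetical.

```python
# A minimal sketch of two administrative tasks an online PA system can automate:
# (1) anonymized random distribution of submissions to peer assessors, and
# (2) calculation of marks from the rubric ratings each submission receives.
# All names and structures are illustrative, not those of any specific system.
import random
import statistics
import uuid

def distribute_anonymously(submissions, assessors, reviews_per_submission=3):
    """Assign each submission to several assessors, excluding its own author,
    and hide the author's identity behind a random token."""
    assignments = {assessor: [] for assessor in assessors}
    for author, work in submissions.items():
        token = uuid.uuid4().hex[:8]                      # anonymized identifier
        eligible = [a for a in assessors if a != author]  # no self-assessment
        chosen = random.sample(eligible, k=min(reviews_per_submission, len(eligible)))
        for assessor in chosen:
            assignments[assessor].append({"token": token, "work": work})
    return assignments

def calculate_marks(ratings):
    """Average the rubric ratings received by each anonymized submission."""
    return {token: round(statistics.mean(scores), 2)
            for token, scores in ratings.items()}

# Example usage with toy data
submissions = {"alice": "report_A.pdf", "bob": "report_B.pdf", "carol": "report_C.pdf"}
assessors = ["alice", "bob", "carol", "dan"]
print(distribute_anonymously(submissions, assessors, reviews_per_submission=2))
print(calculate_marks({"a1b2c3d4": [4, 5, 4], "e5f6a7b8": [3, 4]}))
```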
