Abstract
This study examined the reliability and validity of Web-based portfolio peer assessment. Participants were 72 second-year senior high school students taking a computer course. The results indicated that: 1) student raters showed a lack of consistency when scoring the same portfolio, indicating poor inter-rater reliability; 2) two-thirds of the raters were inconsistent when assessing different portfolios, indicating poor intra-rater reliability; 3) peer-assessment scores were not consistent with teacher-assessment scores, indicating weak criterion-related validity; 4) significant differences were found between peer-assessment scores and end-of-course examination scores, implying that Web-based portfolio peer assessment failed to reflect learning achievement (criterion-related validity). In short, Web-based portfolio peer assessment in this study was neither a reliable nor a valid assessment method.