Abstract

Despite growing attention to Automated Essay Scoring (AES) systems, they are rarely used in Korean EFL high schools. Among the various free and easily accessible AES systems available, this study investigates the efficacy of three in the Korean EFL high school context: Grammarly, MS Editor, and ProWritingAid. To this end, the three AES systems were compared with two human raters in three respects: scores assigned to Korean EFL high school students' writings, errors detected per error category, and errors detected per student writing. The results of t-tests, ANOVA, and correlation analyses raise doubts about the validity of the AES programs and their role as essay evaluators. Although AES programs have improved, they are not yet comparable to human raters or teacher feedback; nevertheless, they show potential as a supplementary tool in the Korean EFL context. The three AES systems were therefore compared in the hope of identifying the optimal choice. The findings offer insights into how AES systems may be implemented in Korean EFL high school classrooms and further emphasize the importance of the teacher's role in introducing a new tool.
