Abstract

Automated Writing Evaluation (AWE) systems have garnered growing attention in recent years as a powerful complement to traditional teacher feedback. Drawing on data collected from student written texts, AWE feedback, teacher feedback, and student interviews at a Chinese university, this study investigates the impact of combined man-machine feedback on the teaching of English writing in an online English as a foreign language (EFL) context, as well as students’ engagement with it. The results show that, compared with the teacher, AWE provides more feedback items and focuses on surface-level features such as vocabulary, mechanics, and grammar, whereas the teacher attends to both surface-level and meaning-level features, including organization, content, and coherence. Although there is little difference in uptake rate between teacher feedback and AWE feedback, the uptake rate of meaning-level feedback is much higher than that of surface-level feedback. Furthermore, students believe that uptake rate is influenced by various factors, such as English competence, feedback quality, scores, available time, personal preference, motivation, and interest in English learning. Additionally, students report that the combination of AWE feedback and teacher feedback has a positive impact on stimulating their writing enthusiasm and improving their writing abilities.
