Abstract

The Automated Writing Evaluation (AWE) system has garnered growing attention in recent years as a powerful complement to traditional teacher feedback. Drawing on data collected from student written texts, AWE feedback, teacher feedback, and student interviews at a Chinese university, this study investigates the impact of man-machine feedback on the practice of teaching English writing in an online English as a foreign language (EFL) context, as well as students' engagement with it. The results show that, compared with the teacher, AWE provides more feedback items and focuses on surface-level issues such as vocabulary, mechanics, and grammar, whereas the teacher attends to both surface-level and meaning-level issues, including organization, content, and coherence. Although there is little difference in uptake rate between teacher feedback and AWE feedback, the uptake rate of meaning-level feedback is much higher than that of surface-level feedback. Furthermore, students believe that the uptake rate is influenced by various factors, such as English competence, feedback quality, score, available time, personal preference, motivation, and interest in English learning. Additionally, students consider that the combination of AWE feedback and teacher feedback has a positive impact on stimulating their writing enthusiasm and improving their writing abilities.
