Abstract

Providing feedback on students’ writing is considered important by both writing teachers and students. However, contextual constraints, including heavy workloads and large classes, pose major and recurrent challenges for teachers. To lighten the feedback burden, teachers can draw on a range of automated feedback tools. This paper investigated how automated feedback can be integrated with traditional teacher feedback by analyzing the focus of teacher and Grammarly feedback on language- and content-related issues. The inquiry also considered whether and how successfully students exploited feedback from the different sources in their revisions, and how the feedback helped improve their writing performance. The sample comprised 216 argumentative and narrative essays written by 27 low-intermediate-level students at a Myanmar university over a 13-week semester. The feedback analysis showed that Grammarly addressed surface-level errors, whereas teacher feedback covered both lower- and higher-level writing concerns, suggesting a potential for integration. The results of the revision analysis and the pre- and post-tests indicated that students made effective use of the feedback they received and that their writing performance improved according to the assessment criteria. These data were triangulated with self-assessment questionnaires capturing students’ emic perspectives on how useful they found the feedback. Pedagogical implications for integrating automated and teacher feedback are presented.
