Abstract

This study investigates the effects of using an online Automated Essay Assessing (AEA) system on EFL graduate students’ writing. Eighty-four EFL graduate students, divided into a treatment group and a control group, participated in the study. The treatment group was asked to use an AEA system to assist their essay writing. Both groups were pretested before the treatment, and after the sixteen-week treatment period a writing posttest was given to both groups to measure their progress. The results indicate that, although the two groups showed no significant difference on the pretest, the difference between their mean posttest scores is statistically significant: the mean score of the treatment group is higher than that of the control group, and students’ frequency of use of the AEA system is closely related to their writing improvement. It is therefore concluded that students’ use of the AEA system contributed significantly to their writing improvement, and a linear regression model is fitted to estimate the weight of AEA system use frequency in predicting writing proficiency gains. Finally, students’ perceptions, drawn from their learning journals and interviews, reveal changes in their writing process.
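The abstract describes fitting a linear regression that predicts writing improvement from AEA system use frequency. The study’s actual data are not reproduced here, so the sketch below uses hypothetical numbers purely to illustrate the kind of model described: an ordinary least-squares fit of score gain (posttest minus pretest) on use frequency.

```python
import numpy as np

# Hypothetical illustration only: these are NOT the study's data.
# x = number of times a student used the AEA system during the term
# y = score gain from pretest to posttest
use_frequency = np.array([2, 5, 8, 10, 13, 15, 18, 20], dtype=float)
score_gain = np.array([1.0, 2.5, 3.0, 4.5, 5.0, 6.5, 7.0, 8.5])

# Ordinary least squares: gain = intercept + slope * frequency
slope, intercept = np.polyfit(use_frequency, score_gain, 1)

# Predicted gain for a student who used the system 12 times
predicted = intercept + slope * 12
print(round(slope, 3), round(intercept, 3), round(predicted, 2))
```

A positive slope here would correspond to the study’s finding that more frequent AEA use is associated with larger writing gains; the actual coefficients would of course come from the real pretest/posttest data.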

Highlights

  • E-assessment applied to writing takes the form of Automated Essay Scoring (AES) and Automated Essay Assessing (AEA)

  • This study investigates the effects of using an online Automated Essay Assessing (AEA) system on EFL graduate students’ writing

  • Eighty-four EFL graduate students, divided into a treatment group and a control group, participated in this study


Summary

Introduction

E-assessment applied to writing takes the form of AES and AEA. According to researchers (Rudner & Liang, 2002; Dikli, 2006), AES is defined as computer technology that evaluates and scores written prose so as to overcome issues of time, cost, reliability, and generalizability in writing assessment. The online AEA system (http://www.pigai.org/) falls into the category of the third stage. It compares uploaded essays against a large-scale corpus, checking whether collocations and chunks are authentic in order to spot typical Chinglish, i.e., ungrammatical or nonsensical English influenced by the Chinese language in Chinese contexts. It takes advantage of cloud computing to respond promptly to each uploaded essay, scoring it, reviewing the whole essay, and analyzing each sentence automatically. It also points out mistakes, which range from spelling errors to word-usage errors and even collocation errors.
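The corpus-based collocation check described above can be illustrated with a deliberately minimal sketch. This is not the pigai.org implementation, and the reference sentences and learner sentence are invented for the example: word pairs (bigrams) in the learner’s sentence that never occur in the reference corpus are flagged as possibly unidiomatic.

```python
# Minimal sketch of corpus-based collocation checking (hypothetical data,
# not the actual pigai.org system): flag learner bigrams absent from a
# reference corpus.
reference_sentences = [
    "she gained a great deal of knowledge",
    "he learned a great deal from the course",
    "they acquired useful knowledge quickly",
]

def bigrams(text):
    """Return the set of adjacent word pairs in a lowercased sentence."""
    words = text.lower().split()
    return set(zip(words, words[1:]))

# Collect every bigram attested in the reference corpus.
reference_bigrams = set()
for sentence in reference_sentences:
    reference_bigrams |= bigrams(sentence)

# A chunk like "learned a lot of knowledge" is often cited as Chinglish.
learner_sentence = "we learned a lot of knowledge"
flagged = sorted(b for b in bigrams(learner_sentence)
                 if b not in reference_bigrams)
print(flagged)
```

A production system would of course use a large-scale corpus, frequency thresholds rather than simple set membership, and longer chunks, but the principle of comparing learner collocations against attested native usage is the same.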

