Abstract

Assessing students' essay writing and providing thoughtful feedback is a labor-intensive and time-consuming task. With human instructors already overwhelmed, the alternative is computer-based grading. Recent advances have generated renewed interest in automatic essay evaluation (AEE). The instantaneous feedback and more consistent grading of AEE help students draft better essays. This work presents a system that automatically grades school children's essays in Arabic, which we call AAEE, for "automatic Arabic essays evaluator". The system is modeled on the scoring scheme followed by school instructors in Saudi Arabia, who assess essays against specific criteria. Building on these criteria, we developed a system that relies on Latent Semantic Analysis and Rhetorical Structure Theory. This design allows us to assess individual components of an essay, such as language proficiency and essay structure. To test the system, we collected essays written by local school children in grades 7–12: a total of 350 handwritten essays, spanning eight different topics, each transcribed into computer-readable format. The AAEE correctly scored 90% of the test essays and achieved a correlation of 0.756 between automatic and teachers' scores, exceeding the human-human correlation of 0.709 for the same Arabic essays.
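The abstract does not detail how the Latent Semantic Analysis component produces a score, so the following is only a minimal sketch of one common LSA-based approach: project essays into a reduced semantic space and score a new essay by its similarity to teacher-graded reference essays. The function name, reference data, and weighting scheme are hypothetical illustrations, not the authors' implementation.

```python
# Sketch of LSA-based essay scoring against graded reference essays.
# Assumes scikit-learn, scipy, and numpy; all names and data are illustrative.
import numpy as np
from scipy.stats import pearsonr
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity


def lsa_scores(reference_essays, reference_grades, new_essays, n_components=2):
    """Score new essays by similarity to teacher-graded references in LSA space."""
    vectorizer = TfidfVectorizer()                       # term-document matrix
    x_ref = vectorizer.fit_transform(reference_essays)
    svd = TruncatedSVD(n_components=n_components)        # reduce to latent semantic space
    ref_vectors = svd.fit_transform(x_ref)
    new_vectors = svd.transform(vectorizer.transform(new_essays))
    sims = np.clip(cosine_similarity(new_vectors, ref_vectors), 0.0, None)
    weights = sims / (sims.sum(axis=1, keepdims=True) + 1e-12)
    # Similarity-weighted average of the reference grades.
    return weights @ np.asarray(reference_grades, dtype=float)


# Hypothetical usage: compare automatic scores with teachers' scores.
refs = ["reference essay one ...", "reference essay two ...", "reference essay three ..."]
ref_grades = [9.0, 6.0, 3.0]
tests = ["a new essay ...", "another new essay ...", "a third new essay ..."]
teacher = [8.0, 5.5, 4.0]

auto = lsa_scores(refs, ref_grades, tests)
r, _ = pearsonr(auto, teacher)   # the paper reports r = 0.756 on its own data
print(f"automatic scores: {auto}, correlation with teachers: {r:.3f}")
```

The weighting of reference grades by cosine similarity is only one way to turn LSA similarities into a score; the paper's actual criteria also involve Rhetorical Structure Theory features, which this sketch does not cover.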
