Abstract

Essays are considered one of the most useful tools for assessing learning outcomes, guiding students’ learning process, and measuring their progress. Manual grading of student essays is time-consuming but nevertheless necessary. Automated essay evaluation offers a practical solution to this task; however, its main weakness is a predominant focus on vocabulary and text syntax, with limited consideration of text semantics. In this work, we propose an extension of existing automated essay evaluation systems that incorporates additional semantic coherence and consistency attributes. We design the novel coherence attributes by transforming sequential parts of an essay into a semantic space and measuring the changes between them to estimate the coherence of the text. The novel consistency attributes detect semantic errors using information extraction and logic reasoning. The resulting system, named SAGE (Semantic Automated Grader for Essays), provides semantic feedback for the writer and achieves significantly higher grading accuracy than nine other state-of-the-art automated essay evaluation systems.
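The abstract describes estimating coherence by mapping consecutive parts of an essay into a semantic space and measuring the change between them. The sketch below illustrates this general idea only; it is not SAGE’s actual method. As a stand-in assumption, segments are embedded as simple bag-of-words vectors (rather than a richer representation such as LSA or word embeddings), and coherence is taken as the average cosine similarity between adjacent segments.

```python
# Illustrative sketch (NOT SAGE's implementation): approximate essay
# coherence as the average semantic similarity of adjacent segments.
import math
import re
from collections import Counter


def embed(segment):
    """Embed a text segment as a bag-of-words count vector.

    This is a simple stand-in for a true semantic-space transformation.
    """
    return Counter(re.findall(r"[a-z]+", segment.lower()))


def cosine(u, v):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(u[w] * v[w] for w in u.keys() & v.keys())
    norm = (math.sqrt(sum(c * c for c in u.values()))
            * math.sqrt(sum(c * c for c in v.values())))
    return dot / norm if norm else 0.0


def coherence_score(segments):
    """Average similarity between adjacent segments; higher values
    suggest a smoother semantic flow through the essay."""
    vectors = [embed(s) for s in segments]
    sims = [cosine(a, b) for a, b in zip(vectors, vectors[1:])]
    return sum(sims) / len(sims) if sims else 1.0


essay = [
    "Dogs are loyal companions and popular pets.",
    "Many pets, including dogs, need daily exercise.",
    "Quantum computers use qubits instead of bits.",
]
# The abrupt topic shift to quantum computing lowers the score.
print(round(coherence_score(essay), 3))
```

Here the first two segments share vocabulary (“dogs”, “pets”) and score well, while the off-topic third segment contributes zero similarity, dragging the overall coherence down; a semantic-space embedding would additionally capture topical relatedness beyond exact word overlap.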
