Abstract
As the COVID-19 pandemic motivated a shift to virtual teaching, exams have increasingly moved online as well. Although a range of proctoring software providers offer to uphold students' academic integrity, preventing cheating is difficult when tech-savvy students take online exams at home and on their own devices. Online at-home exams are, by nature, open-book exams and tempt students to collude and to share materials and answers. However, whilst the online format limits the opportunity for real-time proctoring, the digital output of online exams enables computer-aided detection of collusion. This paper presents two simple data-driven techniques for analyzing exam event logs and essay-form answers. Based on examples from exams in the social sciences, we show that such analyses can reveal patterns of student collusion. We suggest using these patterns as evidence of cheating and as a way to quantify the degree of collusion. Finally, we summarize a set of lessons learned about designing and analyzing online exams.
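The abstract does not detail the two techniques, so the following is only a minimal illustrative sketch of one plausible approach to analyzing essay-form answers: pairwise text similarity across students, with unusually similar pairs flagged for review. The sample answers and the 0.8 cutoff are assumptions for illustration, not values from the paper.

```python
# Sketch: flag pairs of essay answers with suspiciously high similarity.
# Not the paper's exact method; data and threshold are illustrative.
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical student answers keyed by anonymized student ID.
answers = {
    "student_a": "The treaty shifted power toward the executive branch over time.",
    "student_b": "The treaty shifted power towards the executive branch over time.",
    "student_c": "Federal oversight expanded substantially after the 1994 reform.",
}

SIMILARITY_THRESHOLD = 0.8  # illustrative cutoff; would be tuned on real exam data

# Compare every pair of answers and report those above the threshold.
for (id1, text1), (id2, text2) in combinations(answers.items(), 2):
    ratio = SequenceMatcher(None, text1, text2).ratio()
    if ratio >= SIMILARITY_THRESHOLD:
        print(f"possible collusion: {id1} / {id2} (similarity {ratio:.2f})")
```

In practice, flagged pairs would serve only as a starting point for human review, since high similarity can also arise from shared course materials or memorized model answers.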