Abstract

The popularity of massive open online courses (MOOCs) and other forms of distance learning has increased in recent years, and schools and institutions are moving online to serve their students better. Exam integrity depends on the effectiveness of proctoring remote online exams, and proctoring services powered by computer vision and artificial intelligence have gained popularity accordingly. Such systems should employ methods that guarantee an impartial examination. This research demonstrates how to build a multi-model computer vision system that identifies and deters abnormal student behaviour during exams. The system uses You Only Look Once (YOLO) models and Dlib facial landmarks to recognize faces and objects and to detect eye, hand, and mouth-opening movements, sideways gaze, and mobile phone use. Our approach analyzes student behaviour with a deep neural network model trained on our newly produced dataset, "StudentBehavioralDS". On this dataset, the Behavioral Detection Model achieved a mean Average Precision (mAP) of 0.87, while the Mouth Opening Detection Model and the Person and Objects Detection Model achieved accuracies of 0.95 and 0.96, respectively. These results demonstrate good detection accuracy. We conclude that, using computer vision and deep learning models trained on a private dataset, our approach provides a range of techniques for spotting abnormal student behaviour during online tests.
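For illustration, the sketch below shows how such a multi-model frame check can be assembled from off-the-shelf components. It is not the authors' implementation: the YOLO weights ("yolov8n.pt" via the ultralytics package), the Dlib 68-point predictor file, and the mouth-opening threshold are assumed placeholders, and only the person/phone and mouth-opening cues mentioned in the abstract are covered.

```python
# Illustrative sketch only: assumes a COCO-pretrained YOLO model (ultralytics)
# and Dlib's public 68-point facial landmark predictor, not the paper's weights.
import cv2
import dlib
import numpy as np
from ultralytics import YOLO

object_model = YOLO("yolov8n.pt")  # assumed general-purpose person/object detector
face_detector = dlib.get_frontal_face_detector()
landmark_predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

MOUTH_OPEN_RATIO = 0.35  # hypothetical threshold, not taken from the paper


def mouth_open(landmarks) -> bool:
    """Compare the inner-lip gap to the inner-mouth width (Dlib points 60-67)."""
    pt = lambda i: np.array([landmarks.part(i).x, landmarks.part(i).y], dtype=float)
    gap = np.linalg.norm(pt(62) - pt(66))    # upper vs. lower inner lip
    width = np.linalg.norm(pt(60) - pt(64))  # left vs. right inner mouth corner
    return width > 0 and gap / width > MOUTH_OPEN_RATIO


def analyze_frame(frame) -> list[str]:
    """Return a list of suspicious events found in one webcam frame."""
    events = []

    # Person / object detection: more than one person or a phone in view is flagged.
    detections = object_model(frame, verbose=False)[0]
    labels = [object_model.names[int(c)] for c in detections.boxes.cls]
    if labels.count("person") != 1:
        events.append("person_count_anomaly")
    if "cell phone" in labels:
        events.append("mobile_phone_detected")

    # Facial-landmark checks: missing face or an open mouth (possible talking).
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector(gray)
    if not faces:
        events.append("face_not_visible")
    elif mouth_open(landmark_predictor(gray, faces[0])):
        events.append("mouth_open")

    return events
```

Gaze-direction and hand-movement cues would follow the same per-frame pattern, adding further landmark- or detector-based rules whose outputs are aggregated over time before an alert is raised.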
