Abstract

Active learning requires that students receive continuous feedback about their understanding. Multiple-choice question (MCQ) tests have frequently been used to provide students with this feedback and to measure the effectiveness of this learning model. Constructing a test is a challenging task that is time consuming and requires experience. For these reasons, research efforts have focused on the automatic generation of well-constructed tests. Semantic technologies have played a relevant role in the implementation of these test-generation systems. Nevertheless, existing proposals present a set of drawbacks that restrict their applicability to different learning domains and to the types of tests that can be composed. In this paper, we propose a service-oriented and semantic-based system that overcomes these drawbacks. The system consists of a dynamic strategy for generating candidate distractors (alternatives to the correct answer), a set of heuristics for scoring the distractors' suitability, and a distractor-selection step that takes the desired difficulty level of the test into account. In addition, the final version of each test is created with the Google Forms service, a de facto standard for building online questionnaires.
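To make the generate-score-select pipeline sketched in the abstract concrete, the following Python snippet shows one minimal, hypothetical way such a distractor-selection step could be organized. The scoring heuristic, the function names, and the difficulty parameter are illustrative assumptions only and do not reflect the paper's actual implementation.

    # Illustrative sketch (assumptions, not the paper's method): generate candidate
    # distractors, score their suitability with simple heuristics, and select the
    # ones that best match a requested difficulty level.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        text: str
        semantic_similarity: float  # assumed similarity to the correct answer, in [0, 1]

    def suitability(candidate: Candidate, answer: str) -> float:
        """Toy heuristic: plausible distractors are semantically close to the answer
        and of comparable length."""
        length_match = 1.0 - abs(len(candidate.text) - len(answer)) / max(len(answer), 1)
        return 0.7 * candidate.semantic_similarity + 0.3 * max(length_match, 0.0)

    def select_distractors(candidates, answer, target_difficulty=0.5, k=3):
        """Pick the k candidates whose suitability scores lie closest to the
        requested difficulty level."""
        ranked = sorted(candidates,
                        key=lambda c: abs(suitability(c, answer) - target_difficulty))
        return [c.text for c in ranked[:k]]

    if __name__ == "__main__":
        answer = "photosynthesis"
        pool = [
            Candidate("respiration", 0.8),
            Candidate("fermentation", 0.6),
            Candidate("evaporation", 0.3),
            Candidate("transpiration", 0.5),
        ]
        print(select_distractors(pool, answer, target_difficulty=0.6))

In the actual system described by the paper, the candidate pool would come from the dynamic, semantics-driven generation strategy rather than a hard-coded list, and the scoring would rely on its published heuristics.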
