Abstract

INTRODUCTION
Student self-assessment using computer-based quizzes has been shown to increase subject memory and engagement. Some types of self-assessment quiz create a dilemma between 1) medical students, who want the self-assessment quiz to be clearly related to upcoming summative assessments or curated by the exam-setters, and 2) university administrators and ethics committees, who want clear guarantees that the self-assessment quizzes are neither based on the summative assessments nor written by instructors familiar with the exam bank of items. One possible way past this apparent ethical impasse is to use a computer to design the self-assessment questions, with the input to the algorithm being information freely available to the students in the course learning materials (i.e. criterion-referenced from the learning outcomes). Here we describe how a computer can formulate large numbers of these basic, rote-memorization MCQs; we also briefly describe whether students found such questions satisfactory and whether students engaged with our project.

METHODS
An algorithm in Matlab was developed to formulate multiple choice questions for both ion transport proteins and pharmacology; a minimal sketch of this kind of template-based question generation is given after this abstract. A schematic of the Matlab process is shown in Figure 1. The Matlab scripts and sample materials are freely available online (https://github.com/harry-witchel/Ion-Channel-MCQs and https://github.com/harrywitchel/Pharmacology-MCQs). Instructor-verified questions/items were uploaded to the Synap.ac online self-quiz web platform, and 48 first-year United Kingdom undergraduate medical students engaged with it for 3 weeks. Anonymized student engagement statistics were provided by the Synap platform, and a paper-based exit questionnaire with an 80% response rate (n = 44) measured satisfaction.

RESULTS
Four times as many students primarily accessed the quiz system via laptop as via phone/tablet (Figure 2). Across the 391 online questions/items, over 11,749 attempts were made. Usage data showed that self-assessment peaked in the days immediately before summative assessments and dipped on the days of the assessments themselves. Numerical subjective responses on the exit questionnaire used a Likert format (1 = strongly disagree, 2 = disagree, 3 = neither agree nor disagree, 4 = agree, 5 = strongly agree); a worked example of the percent-agreement calculation also follows this abstract. More than 80% of respondents agreed or strongly agreed with each of the positive statements ("I found the quiz software enjoyable to use", "I found it easy to use", "I engaged more with the study material because of the online quizzes", "I felt I learned more by using these quizzes"). The most positively answered numerical question was "I would like these kinds of quizzes to be extended to other modules", with which 77% of respondents strongly agreed. Subjective responses are shown in Figure 3.

CONCLUSIONS
Despite the questions being simplistic and based on rote memorization, students engaged with the questions developed by this system and received them positively. Students strongly supported extending the system.

Support or Funding Information
Brighton and Sussex Medical School Independent Research Project Programme

This abstract is from the Experimental Biology 2019 Meeting. There is no full text article associated with this abstract published in The FASEB Journal.
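To make the METHODS concrete, below is a minimal sketch of template-based MCQ generation in Matlab (the language named above). The fact table, variable names, and question template are illustrative assumptions and are not taken from the published scripts; the actual generators are available at the GitHub links in METHODS.

    % Minimal sketch of template-based MCQ generation, assuming a small
    % fact table of drug/class pairs; the facts and names below are
    % illustrative only and are NOT taken from the published scripts.
    facts = struct('drug',  {'atenolol', 'amlodipine', 'ramipril', 'furosemide', 'simvastatin'}, ...
                   'class', {'beta-blocker', 'calcium channel blocker', ...
                             'ACE inhibitor', 'loop diuretic', 'statin'});

    nOptions  = 4;                          % one correct answer + three distractors
    questions = cell(numel(facts), 1);
    for i = 1:numel(facts)
        % Draw distractors from the classes of the other facts in the table
        others  = setdiff(1:numel(facts), i);
        d       = others(randperm(numel(others), nOptions - 1));
        options = [{facts(i).class}, {facts(d).class}];

        % Shuffle the options so the position of the key varies
        order = randperm(nOptions);
        questions{i} = struct( ...
            'stem',    sprintf('Which drug class does %s belong to?', facts(i).drug), ...
            'options', {options(order)}, ...
            'answer',  find(order == 1));   % index of the correct option
    end

Because distractors are drawn from the same fact table as the key, every option is plausible within the topic, and shuffling keeps the key position unpredictable across the generated items.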

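The percent-agreement figures reported in RESULTS can be reproduced from raw Likert codes as follows; the response vector here is invented for illustration and is not the study data.

    % Hypothetical worked example: percent of respondents choosing
    % 4 (agree) or 5 (strongly agree) on one Likert item.
    responses = [5 4 4 5 3 5 4 2 5 4];        % 1-5 Likert codes (made up)
    pctAgree  = 100 * mean(responses >= 4);   % 80 in this example
    fprintf('%.0f%% agreed or strongly agreed\n', pctAgree);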