Abstract

Introduction
Writing and answering multiple choice questions (MCQs) is a learning activity that potentially engages deep learning. We conducted three year-long case studies of MCQ writing and answering in PeerWise to engage students in learning Pathology.

Methods
Overall, an instrumental case-study design with the structure of sequential multiple case studies was used. Across three years, fourth-year medical students were required to write and answer MCQs. In 2016, students were provided with advice for writing questions and were encouraged to adhere to Bloom's taxonomy. In 2017, to reduce cognitive load, students were provided with an MCQ template and allocated topics. In 2018, to encourage engagement, students were informed that the top forty MCQs would be included in the final exam.

Results
An evaluation survey was used to measure each student's perception of the MCQ exercise. In 2016, most students had a negative opinion of the MCQ exercise, finding the writing of MCQs too time-consuming and demanding. In 2017, students' attitudes to the MCQ exercise were more positive. In 2018, there were insufficient responses to the survey, but informal student feedback suggested the MCQ exercise was considered an inefficient use of student study time. There were minimal changes in students' activity levels from 2016 to 2017. However, in 2018, when students were informed that the top forty MCQs generated would be included in their final exam, they answered a greater number of MCQs than in previous years.

Conclusions
Providing students with templates and assigning topics for MCQs may improve student attitudes toward MCQ writing, and including student-generated MCQs in the final exam encourages students to answer more MCQs. However, due to high demands on their time, medical students prioritised efficiency, and MCQ writing may not be an efficient strategy for deep learning.

Highlights

  • Writing and answering multiple choice questions (MCQs) is a learning activity that potentially engages deep learning

  • In 2018 there were insufficient responses to the survey but informal student feedback suggested the MCQ exercise was considered an inefficient use of student study time

  • In 2018 when students were informed that the top forty MCQs generated would be included in their final exam they answered a greater number of MCQs than in previous years

Introduction

Writing and answering multiple choice questions (MCQs) is a learning activity that potentially engages deep learning. We conducted three year-long case studies of MCQ writing and answering in PeerWise to engage students in learning Pathology. MCQ writing for question banks has been reported as a small-group learning activity for medical students, who report it as an enjoyable and useful learning activity (Gooi and Sommerfeld, 2015; Harris et al., 2015; Lawrence, Famokunwa and Ziff, 2016). Our pilot of an MCQ authoring and answering exercise had variable student acceptance (Grainger et al., 2018). We report three consecutive year-long case studies of a student-generated MCQ learning exercise that aimed to enhance deep learning and knowledge construction in a fourth-year pathology course, and reflect on the lessons learnt.
