Abstract

During the COVID pandemic, universities around the globe had to move not only their content delivery online, but also their assessments. Because COVID caused significant upheaval in Higher Education (HE), this enforced experiment also afforded an opportunity to reflect on traditional, invigilated, closed book exams (ICBEs), resulting in research and advice in this area. A systematic review of this academic and grey literature was performed, concentrating on maths-heavy physics examinations, to investigate what guidance is given to examination writers, to educators who prepare students for exams, and to HE examinees themselves. The literature review results were divided into advice for examiners who need to provide an uninvigilated open book exam (UOBE), discussions of cheating, advice for students, and case studies. It was found that ICBEs are good at examining lower order cognitive skills, e.g. recall and understanding, whereas higher order skills, such as analysing and synthesising, are better examined with access to a larger range of resources. Guidance on making academic misconduct more difficult also suggested targeting higher order thinking skills in exam questions, as responses to these types of tasks are more individual and obtaining outside help may be more difficult in a time-constrained UOBE. Furthermore, the literature encouraged reflection on the motivation for cheating and suggested that overly demanding assessment may encourage students to seek inappropriate help. The advice for students highlighted the need to prepare as thoroughly for a UOBE as they would for a traditional exam; the emphasis should probably shift from pure memorisation to students organising their notes so that they can efficiently locate relevant material for synthesis during a UOBE. Some of the case studies used statistical methods to investigate the comparability of grades between UOBEs and ICBEs, and several found them comparable, suggesting that a large shift in results may be due to factors other than the exam type. Other studies describe their approach and include stakeholder reflections. The main recommendation, to exclude lower order cognitive skills, can pose a problem for maths-heavy exams, as these mainly assess how well an examinee has mastered such skills before building on them. However, it seems advisable to climb higher up Bloom's taxonomy where possible. It may also be feasible to break exams into shorter sections that must be uploaded individually before access to the next part is granted, reducing the possibility of outside help. Furthermore, individualised maths-type problems could be achieved by using different data sets for a question. Student advice should highlight the differences between UOBEs and ICBEs so that students can prepare appropriately.