Abstract

This paper synopsizes the efforts of a team of general chemistry teachers to enact assessments during the abrupt transition to online-only instruction and reflects on what was done successfully and what could be improved. The focus is on the extent to which remote, online assessments accurately measured student knowledge, described within the context of the decisions made for administering the assessments. To limit unintended student collaboration, exams were given at a set day and time, question banks were developed so that students received variants of similar questions, and remote proctoring software was used. Correlations showed consistency between online exams and paper exams, alleviating concerns that widespread cheating may have taken place. Despite efforts to keep the online exam secure, a concern emerged with reusing the same exam at a later date for students who missed the original sitting. Generating alternative exams for later dates is recommended, potentially supported by creating and curating an online repository of assessment items.
