Abstract

During a two-day objective structured clinical examination (OSCE), we compared two types of checklists for rating student performance: paper-and-pencil checklists versus digital checklists on iPads. Several subjective and objective measures were collected from 10 examiners. The data showed that digital checklists were perceived as significantly more usable and less effortful, and were also preferred in overall ratings. Assessments completed with digital checklists had no missing items, whereas assessments completed with paper checklists contained more than 8 blank items on average. Finally, checklist type did not influence assessment scores, even though more item-choice changes were made when using digital checklists.
