Many 4-year public institutions face significant pedagogical challenges due to the high ratio of students to teaching team members. To address this issue, we developed a workflow in the programming language R to rapidly grade multiple-choice questions, adjust for errors, and grade answer-dependent multiple-choice questions, thus shifting the teaching team's time commitment back to student interaction. We provide an example of answer-dependent multiple-choice questions and demonstrate how the output allows for discrete analysis of questions by categories such as Fundamental Statements or Bloom's Taxonomy levels. Additionally, we show how student demographics can be easily integrated to yield a holistic perspective on student performance in a course. The workflow offers dynamic grading of multiple-choice questions and remains versatile through its adaptability to different assessment analyses. This approach allows instructors to pinpoint factors affecting student performance and respond to them to foster a healthy learning environment.
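As a rough illustration of the kind of grading logic such a workflow automates, the sketch below scores responses against an answer key and handles one answer-dependent item in base R. The column names, answer key, and dependency rule here are hypothetical examples for illustration only, not the authors' published implementation.

```r
# Minimal sketch of automated multiple-choice grading in base R.
# Column names (student_id, q1, q2) and the answer-dependent rule
# are assumed for illustration; the actual workflow may differ.

# Example responses (in practice, read from a CSV of exam responses)
responses <- data.frame(
  student_id = c("s01", "s02", "s03"),
  q1 = c("A", "B", "A"),
  q2 = c("C", "D", "D"),
  stringsAsFactors = FALSE
)

# Answer key for a standard question: 1 point for a match
key_q1 <- "A"
responses$q1_score <- as.integer(responses$q1 == key_q1)

# Answer-dependent question: the correct choice for q2 depends on
# which option the student selected for q1
q2_key_by_q1 <- c(A = "C", B = "D")
responses$q2_score <- as.integer(responses$q2 == q2_key_by_q1[responses$q1])

# Error adjustment: e.g., award full credit on a flawed item
# responses$q1_score <- 1

responses$total <- responses$q1_score + responses$q2_score
responses
```

Because the scores live in an ordinary data frame, per-question results could then be joined with item metadata (e.g., Bloom's Taxonomy level) or student demographics for the category-level analyses described in the abstract.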