Background
High-stakes assessments, such as the Graduate Record Examination, have transitioned from paper-based to computer-based administration. Low-stakes research-based assessments (RBAs), such as the Force Concept Inventory, have only recently begun this transition to computer administration through online services. These online services can simplify administering, scoring, and interpreting assessments, thereby reducing barriers to instructors' use of RBAs. By supporting instructors' objective assessment of the efficacy of their courses, these services can stimulate instructors to transform their courses to improve student outcomes. We investigate the extent to which RBAs administered outside of class with the online Learning About STEM Student Outcomes (LASSO) platform provide data equivalent to tests administered on paper in class, in terms of both student participation and performance. We use an experimental design to investigate the differences between these two assessment conditions with 1310 students in 25 sections of 3 college physics courses spanning 2 semesters.

Results
Analysis conducted using hierarchical linear models indicates that student performance on low-stakes RBAs is equivalent for online (out-of-class) and paper-and-pencil (in-class) administrations. The models also show that participation rates differ across assessment conditions and student grades, but that instructors can achieve participation rates with online assessments equivalent to those with paper assessments by offering students credit for participating and by providing multiple reminders to complete the assessment.

Conclusions
We conclude that online, out-of-class administration of RBAs can save class and instructor time while providing participation rates and performance results equivalent to those of in-class paper-and-pencil tests.