Abstract

Conceptual inventory surveys are routinely used in education research to identify student learning needs and to assess instructional practices. Because of the low stakes attached to these instruments, students might not fully engage with them. This paper explores tests that can be used to estimate the percentage of students in a population who might not have taken such surveys seriously. The three seriousness tests are the pattern recognition test, the easy questions test, and the uncommon answers test. They are applied to sets of students who were assessed by either the Force Concept Inventory, the Conceptual Survey of Electricity and Magnetism, or the Brief Electricity and Magnetism Assessment. The results of our investigation are compared to computer-simulated populations of random answers.

Received 15 May 2019
DOI: https://doi.org/10.1103/PhysRevPhysEducRes.15.020118
Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI.
Physics Subject Headings (PhySH): Assessment, Research methodology, Physics Education Research

Highlights

  • Conceptual inventories (CIs) arose from the need to quantify students’ understanding of concepts and their progress in class by monitoring learning gains [1]

  • We developed three seriousness tests that can be applied to Force Concept Inventory (FCI), Conceptual Survey of Electricity and Magnetism (CSEM), and Brief Electricity and Magnetism Assessment (BEMA) responses in order to estimate the percentage of students in a sample who did not take these research-based assessment instruments (RBAIs) seriously: the pattern recognition test (PRT), the uncommon answers test (UAT), and the easy questions test (EQT)

  • Our results contrast with the work by Henderson mentioned in the introduction, who found that about 2.8% of students did not take the FCI seriously [32], and with the results of Pollock et al., who found that 3% of students indicated that they did not take the BEMA seriously [42]
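The pattern recognition test, for instance, flags response sheets that follow a mechanical layout rather than reflecting the questions' content. The paper's exact criteria are not reproduced here; the following is a minimal sketch under the assumption that each student's answers are encoded as a string of choice letters, with `has_repeating_pattern` and `percent_flagged` as hypothetical helper names:

```python
def has_repeating_pattern(answers, max_period=3):
    """Return True if the answer sheet is a repetition of a short block.

    `answers` is one student's responses as a string of choice letters,
    e.g. "ABCABCABC" (an assumed encoding, not the authors' code).
    """
    n = len(answers)
    for period in range(1, max_period + 1):
        # Patterned if every answer equals the one `period` positions earlier.
        if all(answers[i] == answers[i % period] for i in range(n)):
            return True
    return False


def percent_flagged(sheets, max_period=3):
    """Percentage of response sheets flagged as patterned."""
    flagged = sum(has_repeating_pattern(s, max_period) for s in sheets)
    return 100.0 * flagged / len(sheets)
```

With `max_period=3`, a sheet such as "ABCABCABC" would be flagged, while a sheet whose answers vary irregularly would not.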


Summary

INTRODUCTION

Conceptual inventories (CIs) arose from the need to quantify students’ understanding of concepts and their progress in class by monitoring learning gains [1]. Halloun and Hestenes raised the concern that traditional instruction only marginally affects students’ understanding, while students’ common-sense beliefs usually contradict the laws of physics [3,4]. Their Force Concept Inventory (FCI) arrived as a first tool to measure students’ mastery of the force concepts widely taught in the first semester of physics [5]. When data from research-based assessment instruments (RBAIs) are collected regularly, they can serve as valuable measuring tools by providing standardized comparisons among institutions, instructors, and teaching methods, and over multiple implementations of the same course. They allow us to track trends and investigate correlations over time [25,26]. The Brief Electricity and Magnetism Assessment (BEMA) is a 31-question RBAI designed to assess conceptual understanding of electromagnetism.
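The abstract mentions comparing the seriousness tests against computer-simulated populations of random answers. Such a baseline can be sketched as follows; the function name, the five-choice assumption, and the letter encoding are illustrative assumptions (31 questions is the BEMA count quoted above), not the authors' simulation code:

```python
import random


def simulate_random_population(n_students, n_questions=31, n_choices=5, seed=0):
    """Simulate students who answer every question uniformly at random.

    Returns a list of answer sheets, each a string of choice letters.
    A seeded generator makes the simulated population reproducible.
    """
    rng = random.Random(seed)
    letters = "ABCDEFGHIJ"[:n_choices]
    return [
        "".join(rng.choice(letters) for _ in range(n_questions))
        for _ in range(n_students)
    ]
```

Running the seriousness tests on such a population gives the rate at which purely random responders are (or are not) caught, which is the comparison the abstract describes.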

Main concerns with RBAIs
THE SERIOUSNESS TESTS
The pattern recognition test
The uncommon answers test
The easy questions test
Combining the results to determine the overall percent of nonserious students
CONCLUSION