Abstract

Students’ ability to effectively study for an exam, or to manage their time during an exam, is related to their metacognitive capacity. Prior research has demonstrated that the effective use of metacognitive strategies during learning and retrieval is related to content expertise. Students also make judgments of their own learning and of problem difficulty to guide their studying. This study extends prior research by investigating the accuracy of novices’ and experts’ ability to judge problem difficulty across two experiments; here “accuracy” refers to whether or not their judgments of problem difficulty correspond with actual exam performance in an introductory mechanics physics course. In the first experiment, physics education research (PER) experts judged the difficulty of introductory physics problems and provided the rationales behind their judgments. Findings indicate that experts use a number of different problem features to make predictions of problem difficulty. While experts are relatively accurate in judging problem difficulty, their content expertise may interfere with their ability to predict student performance on some question types. In the second experiment, novices and “near experts” (graduate TAs) judged which question from a problem pair (taken from a real exam) was more difficult. The results indicate that judgments of problem difficulty are more accurate for those with greater content expertise, suggesting that the ability to predict problem difficulty is a trait of expertise which develops with experience.

Received 1 June 2015
DOI: https://doi.org/10.1103/PhysRevSTPER.11.020128

This article is available under the terms of the Creative Commons Attribution 3.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article’s title, journal citation, and DOI. Published by the American Physical Society.

Highlights

  • The method that a student utilizes when preparing for an exam, or while taking an exam, is influenced by the student’s metacognitive abilities

  • This study extends prior research by investigating the accuracy of novices’ and experts’ ability to judge problem difficulty across two experiments; here “accuracy” refers to whether or not their judgments of problem difficulty correspond with actual exam performance in an introductory mechanics physics course

  • The results indicate that judgments of problem difficulty are more accurate for those with greater content expertise, suggesting that the ability to predict problem difficulty is a trait of expertise which develops with experience


Summary

INTRODUCTION

The method that a student utilizes when preparing for an exam, or while taking an exam, is influenced by the student’s metacognitive abilities. Allocating study time according to judged difficulty is generally advantageous: individuals who spend more time studying problems they judge to be more difficult tend to outperform those who study items in other predetermined sequences [10,11]. This advantage, however, only holds for students whose judgments of problem difficulty are aligned with normative measures of difficulty [12]. When individuals know the solution to a problem, they are no longer able to use subjective experience effectively to make predictions of difficulty. Rather, they must rely on analytic alternatives, such as implicit (or, in some cases, explicit) theories about what makes problems within the domain difficult. This motivates exploring how accurate students’ judgments are, since accuracy likely impacts both their ability to select problems to practice on while studying for an exam and their ability to pace themselves optimally during an exam.

Method
Results
Types of rationales given by experts
Successful and unsuccessful types of rationales used by experts
What other factors impact expert performance?
Discussion
EXPERIMENT 2
Findings
CONCLUSION
