Abstract

Although it is well established that our thinking can often be biased, the precise cognitive mechanisms underlying these biases are still debated. The present study builds on recent research showing that biased reasoners often seem aware that their reasoning is incorrect; they show signs of conflict detection. One important shortcoming in this research is that the conflict detection effect has only been studied with classic problem-solving tasks, which require people to make a decision themselves. In many reasoning situations, however, people are confronted with decisions already made by others. Therefore, the present study (N = 159) investigated whether conflict detection occurs not only during reasoning on problem-solving tasks (i.e., decision-making) but also on vignette tasks, which require participants to evaluate decisions made by others. We analyzed participants' conflict detection sensitivity on confidence and response time measures. Results showed that conflict detection occurred during both decision-making and decision-evaluation, as indicated by decreased confidence. The response time index appeared to be a less reliable measure of conflict detection on the novel tasks. These findings are relevant for studying reasoning in contexts in which recognizing reasoning errors is important; for instance, in education, where teachers have to give feedback on students' reasoning.

Highlights

  • Every day, people make countless decisions, and the vast majority are made effortlessly, without deliberate thought

  • This pattern applied to both bias tasks and both task formats

  • In contrast, participants performed conflict tasks significantly better in the vignette format than in the problem-solving format

Introduction

People make countless decisions, and the vast majority are made effortlessly, without deliberate thought. This is highly adaptive: we would be exhausted if we had to think through each and every decision, and effortless thinking usually yields good decisions. However, it can also lead to biases in reasoning (Kahneman, 2011; Stanovich et al., 2016). Biases are systematic errors in people's thinking that violate the normative rules of rationality, as set, for instance, by logic or probability theory (Stanovich et al., 2016; Tversky & Kahneman, 1974).
