Abstract

Duhem’s problem arises especially in scientific contexts where the tools and procedures of measurement and analysis are numerous and complex. Several philosophers of cognitive science have cited its manifestations in fMRI as grounds for skepticism about the epistemic value of neuroimaging. To address these Duhemian arguments for skepticism, I offer an alternative, drawing on Deborah Mayo’s error-statistical account, in which Duhem’s problem is more fruitfully framed in terms of error probabilities. I illustrate this approach with examples such as the use of probabilistic brain atlases, the comparison of preprocessing protocols with respect to their error characteristics, and the statistical modeling of fMRI data. These examples show how we can better understand and formulate the general methodological problem, and they point the way toward approaches to neuroimaging in the philosophy of cognitive science that scrutinize the epistemic value of neuroimaging studies in a more balanced and productive manner.
