Abstract
Assessing scientists using exploitable metrics can lead to the degradation of research methods even without any strategic behaviour on the part of individuals, via ‘the natural selection of bad science.’ Institutional incentives to maximize metrics like publication quantity and impact drive this dynamic. Removing these incentives is necessary, but institutional change is slow. However, recent developments suggest possible solutions with more rapid onsets. These include what we call open science improvements, which can reduce publication bias and improve the efficacy of peer review. In addition, there have been increasing calls for funders to move away from prestige- or innovation-based approaches in favour of lotteries. Using computational modelling, we investigated whether such changes are likely to improve the reproducibility of science even in the presence of persistent incentives for publication quantity. We found that modified lotteries, which allocate funding randomly among proposals that pass a threshold for methodological rigour, effectively reduce the rate of false discoveries, particularly when paired with open science improvements that increase the publication of negative results and improve the quality of peer review. In the absence of funding that targets rigour, open science improvements can still reduce false discoveries in the published literature but are less likely to improve the overall culture of research practices that underlie those publications.
Highlights
The ‘natural selection of bad science’ refers to the degradation of research methodology that results from hiring and promotion practices that reward exploitable metrics such as publication quantity.
If negative results are publishable and worthy of prestige, the question is: How common or prestigious must the publication of negative results be, relative to positive results, in order to mitigate the natural selection of bad science?
We first observed the impact of three pure funding strategies (PH, random allocation (RA) and methodological integrity (MI)) in the absence of open science improvements (p = 0, r = 0).
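The modified lottery described above can be sketched in a few lines: proposals are first screened against a methodological-rigour threshold, and awards are then drawn at random from the eligible pool. This is an illustrative sketch only; the function name, the `(id, rigour)` proposal representation, and the scoring scale are assumptions, not details of the paper's model.

```python
import random

def modified_lottery(proposals, rigour_threshold, n_awards, rng=None):
    """Allocate funding by modified lottery.

    proposals: list of (proposal_id, rigour_score) pairs (hypothetical format).
    rigour_threshold: minimum rigour score required to enter the lottery.
    n_awards: number of awards to draw at random from the eligible pool.
    """
    rng = rng or random.Random()
    # Screening stage: only proposals meeting the rigour threshold are eligible.
    eligible = [pid for pid, rigour in proposals if rigour >= rigour_threshold]
    # Lottery stage: draw winners uniformly at random, without replacement.
    return rng.sample(eligible, min(n_awards, len(eligible)))
```

A pure random-allocation strategy corresponds to setting the threshold to zero, so every proposal is eligible; raising the threshold shifts the scheme toward funding that targets rigour.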
Summary
The ‘natural selection of bad science’ refers to the degradation of research methodology that results from hiring and promotion practices that reward exploitable metrics such as publication quantity. The essential task of changing the norms and institutions regarding hiring, promotion, and publication is likely to be difficult and slow-going. There is an increasing number of journals with mandates to accept all well-done studies regardless of the perceived importance of the results, thereby reducing publication bias; these include PLoS ONE, Collabra, PeerJ, and Royal Society Open Science. We refer to the union of open science and funding improvements as rapid institutional changes to denote that they can be implemented quickly, at least relative to the time scale needed to remove the emphasis on quantitative metrics at the hiring and promotion stages. We review the proposed changes we examine, the rationale behind these changes, and the key questions regarding their consequences.