Abstract

The "replication crisis" describes recent difficulties in replicating studies in various scientific fields, most notably psychology. The available evidence primarily documents replication failures for group research designs. However, we argue that the contingencies of publication bias that led to the "replication crisis" also operate on applied behavior analysis (ABA) researchers who use single-case research designs (SCRD). This bias strongly favors publication of SCRD studies that show strong experimental effects and disfavors publication of studies that show less robust effects. The resulting research literature may unjustifiably inflate confidence about intervention effects, limit researchers' ability to delineate intervention boundary conditions, and diminish the credibility of our science. To counter problems of publication bias in ABA, we recommend that journals that publish SCRD research establish standards for publication of noneffect studies; that our research community adopt open sharing of SCRD protocols and data; and that members of our community routinely publish systematic literature reviews that include gray (i.e., unpublished) research.
