Background
The importance of feasibility studies (also referred to as exploratory or pilot studies) for optimising complex public health interventions and evaluation designs before evaluating effectiveness is widely acknowledged. In a systematic review of guidance on feasibility studies, we found that guidance is lacking or inconsistent on many aspects of their purpose, design, and conduct, and that it is lacking on the evidence needed to inform decisions about when to proceed to an effectiveness study. This work, building on that review, aimed to develop guidance for researchers, peer reviewers, and funders.

Methods
The systematic review was followed by a three-round, web-based Delphi exercise. We identified novel approaches to intervention optimisation and study designs from beyond public health through a scoping review and qualitative interviews with 15 experts in intervention design and evaluation. We discussed key aspects of the draft guidance with evaluation design specialists, funders, and journal editors in a consensus workshop (n=30), and we revised the guidance accordingly. The review is registered with PROSPERO, number CRD42016047843.

Findings
Our systematic review identified 25 unique sets of guidance. The Delphi exercise identified consensus on many aspects of feasibility study methodology, but disagreement on others, including terminology, how feasibility study data can inform decisions about sample size, how progression criteria should be set, and how progression decisions should be made. A number of study designs typically used in clinical studies (eg, n of 1), digital health (eg, A–B testing), and engineering (eg, fractional factorial designs) have the potential to be applied more widely in feasibility studies of complex public health interventions—for example, to optimise interventions or to explore variation in intervention effects.
Interpretation
The guidance will help researchers to develop and conduct feasibility studies, and to take appropriate decisions on progression to an effectiveness study. It will provide peer reviewers and research funders with objective criteria against which to assess bids and publications. Study limitations include a lower response from less experienced researchers than from more experienced research methodologists in the Delphi exercise. The systematic review of guidance covered seven health-related bibliographic databases but might have missed guidance from other areas of social intervention research.

Funding
Medical Research Council (MRC)/National Institute for Health Research (NIHR) Methodology Research Panel (MR/N015843/1).