Abstract
Scientific research is often the result of a spark of creativity. However, formal scientific reporting and creative writing have differing goals that require different approaches. The purpose of a report of a research study is to communicate the design, execution, and findings of the study with precision and accuracy. From the perspective of a journal editor, the writing should be invisible to an informed reader, without byzantine phrasing or ambiguity of meaning. If someone needs to read a sentence multiple times to understand it, the authors and editors have failed. The scientific report should be concise but should also provide transparency and present all of the key information required for researchers, clinicians, and other readers to be able to assess the validity of the study and its findings.

Recognizing deficiencies in the quality of reports of randomized clinical trials (RCTs), a group met in 1993 to establish reporting standards for these studies, with the subsequent publication of the Standards of Reporting Trials (SORT) proposed checklist in 1994,1 followed by the Consolidated Standards of Reporting Trials (CONSORT) statement in 1996.2 Researchers and editors recognized the value of using elements of these standardized approaches, and the CONSORT guidelines became more widely embraced and have become de facto standards for reporting RCTs. Since then, guidelines have been produced for virtually the entire range of research designs, including observational studies (STROBE), diagnostic test assessment (STARD), systematic reviews and meta-analyses (PRISMA and MOOSE), tumor marker studies (REMARK), cost-effectiveness analyses (CHEERS), and preclinical animal studies (ARRIVE), and some, like CONSORT, have been updated. The EQUATOR network was formed to help with the development and dissemination of these guidelines and currently includes 256 reporting guidelines on its website.3

Most reporting guidelines include checklists for authors to provide journals at the time of submission; they are often copublished with practical examples and sometimes a standalone explanation and elaboration document. While the hope is that adherence to these guidelines will result in a published article that is precise and therefore allows readers to make informed judgments, at the least it helps editors and external reviewers conduct effective peer review by increasing the likelihood that critical information is included in the submitted manuscript.

Recognizing the potential value of such guidelines for the peer review and scientific publication processes, JAMA published several of the initial reporting guidelines, including CONSORT2 and MOOSE.4 While the proliferation of guidelines means that many deal with fairly narrow niches, JAMA continues to be interested in publishing guidelines that address common or emerging and important study designs. Recent examples include extensions to the CONSORT guidelines for noninferiority and equivalence designs5 and for patient-reported outcomes6; both of these reflect study designs of increasing prevalence in the clinical literature and hence of growing importance for researchers and clinicians. In this issue of JAMA, Stewart and colleagues7 provide an extension of the PRISMA guidelines for individual participant data (IPD) meta-analyses: the PRISMA-IPD Statement.
Modifications of PRISMA relate to structural elements such as the abstract, but also to issues particular to this study design, such as methods of obtaining the individual participant data, exploring data integrity, handling trials for which individual data were unavailable, and methods for data synthesis. At present, the IPD meta-analysis design represents a small percentage of all systematic reviews and meta-analyses. However, the increasing interest in data sharing8,9 is likely to lead to a proliferation of IPD meta-analyses, so this study design will be increasingly relevant to authors and readers. While meta-analyses based on either aggregate or individual participant data are susceptible to important and sometimes critical limitations,10 having standards for their reporting will help ensure clarity and transparency around these limitations so that readers' interpretations are fully informed.

Even though reporting guidelines like PRISMA-IPD provide consensus-based expert opinion for communicating the methods and results of studies, these recommendations are guidelines, not rules, and journals may choose not to adhere to every point. For instance, JAMA does not ordinarily differentiate subtypes of meta-analyses in the article title or subtitle, as recommended by the PRISMA-IPD Statement (checklist, item 1). Also, because of the uncertainty that may arise when combining data from RCTs from various sources (such as differences in the patient populations, effectiveness of randomization, application of interventions, and assessment of outcomes), the ability to draw valid causal inferences, even from pooled individual participant data, may be limited. Accordingly, JAMA considers meta-analysis to represent an observational design, such that outcomes, inferences, and interpretations should be described as associations rather than reported in causal terms.