Abstract

Publishing and editorial policies differ substantially across Radiation Oncology (RO) and Medical Physics (MedPhys) journals. Adoption of modern standards in scientific publishing and data sharing has the potential to improve the impact and reliability of the RO literature. We characterized the editorial, authorship, and peer reviewer policies of prominent clinical RO (N = 16) and medical physics (N = 9) peer-reviewed journals affiliated with professional societies, focusing on characteristics associated with improved reproducibility and rigorous review. A combination of tools, including Enhancing the QUAlity and Transparency Of health Research (EQUATOR), Findability, Accessibility, Interoperability, and Reuse (FAIR), and Quality Output Checklist and Content Assessment (QuOCCA) principles, was used to quantify the value and reproducibility of journal policies. Cohen's kappa coefficient was used to assess inter-reviewer agreement. Components of the above tools were regressed against various scientometric indices (H-index, impact factor [IF], etc.) to identify factors associated with perceived relative importance within the field. Inter-reviewer agreement (κ) was highest (1.0) for statistical review criteria and data submission standards and lowest (-0.246) for the various submission checklists. Data availability statements were endorsed or required by a higher proportion of RO journals (44% and 31%, respectively) than MedPhys journals (44% and 0%, respectively). Data repository submission was required by <10% of journals. FAIR adoption was poor in both RO (31%) and MedPhys (22%) journals. At least one EQUATOR guideline checklist was endorsed or required by 76% of journals. While there were no glaring differences in editorial policies between RO and MedPhys journals, there was substantial heterogeneity in the scientometrics evaluating the rigor of data submission, reproducibility standards, and statistical review criteria. Linear regression indicated that FAIR adoption, use of EQUATOR checklists, and more rigorous statistical method submission criteria were predictive of journal impact factor. The present review documented and confirmed significant variation in submission, review, and publication policies across RO and MedPhys journals. Established scientometric standards, FAIR principle adoption, and more rigorous statistical methodology were predictive of increasing journal impact factor.
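The abstract describes two analytic steps: Cohen's kappa for inter-reviewer agreement on each policy criterion, and regression of policy features against scientometric indices such as impact factor. The sketch below is a minimal illustration of how such an analysis could be structured, not the authors' code; all ratings, feature values, and impact factors shown are hypothetical placeholders invented for demonstration.

```python
# Minimal sketch (hypothetical data, not values from the study) of the two
# analyses described in the abstract: Cohen's kappa for inter-reviewer
# agreement and a linear regression of impact factor on policy features.
import numpy as np
from sklearn.metrics import cohen_kappa_score
from sklearn.linear_model import LinearRegression

# Hypothetical ratings of the same journals by two independent reviewers on
# one policy criterion (0 = absent, 1 = endorsed, 2 = required).
reviewer_a = [2, 1, 0, 2, 1, 1, 0, 2, 2, 1, 0, 1, 2, 0, 1, 2]
reviewer_b = [2, 1, 0, 2, 1, 0, 0, 2, 2, 1, 0, 1, 2, 0, 1, 1]

kappa = cohen_kappa_score(reviewer_a, reviewer_b)
print(f"Cohen's kappa: {kappa:.3f}")

# Hypothetical per-journal features: FAIR adoption (0/1), number of EQUATOR
# checklists endorsed, and a 0-2 score for statistical review rigor.
X = np.array([
    [1, 3, 2],
    [0, 1, 1],
    [1, 2, 2],
    [0, 0, 0],
    [1, 4, 1],
    [0, 2, 0],
])
impact_factor = np.array([6.2, 2.1, 5.4, 1.8, 7.0, 3.3])  # hypothetical IFs

model = LinearRegression().fit(X, impact_factor)
print("Coefficients (FAIR, EQUATOR count, stats rigor):", model.coef_)
print("R^2:", model.score(X, impact_factor))
```

In practice, each policy criterion would be scored separately by both reviewers so that a kappa value can be reported per criterion, which is consistent with the range of agreement (from -0.246 to 1.0) reported in the abstract.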
