Abstract

To evaluate the use of reporting checklists and quality scoring tools for self-reporting purposes in the radiomics literature. A literature search was conducted in PubMed (search date, April 23, 2023). The radiomics literature was sampled at random after a sample size calculation based on an a priori power analysis. Original research papers were systematically assessed for self-reporting, including the provision of documentation such as completed checklists or quality scoring tools. Eligible papers were evaluated independently by a panel of nine readers, with three readers assigned to each paper; automatic annotation was used to assist in this process. A detailed item-by-item confirmation analysis was then carried out on papers that provided checklist documentation, with independent evaluation by two readers.

The sample size calculation yielded 117 papers. Most of the included papers were retrospective (94%; 110/117), single-center (68%; 80/117), based on private data (89%; 104/117), and lacked external validation (79%; 93/117). Only seven papers (6%) provided at least one self-reported document (Radiomics Quality Score (RQS), Transparent Reporting of a multivariable prediction model for Individual Prognosis Or Diagnosis (TRIPOD), or Checklist for Artificial Intelligence in Medical Imaging (CLAIM)), which was statistically significant on a binomial test (p < 0.001). The median rate of confirmed items across all three document types was 81% (interquartile range, 6). For the quality scoring tools, documented scores were higher than the suggested scores, with a mean difference of -7.2 (standard deviation, 6.8).

Radiomics publications often lack self-reported checklists or quality scoring tools. Even when such documents are provided, caution is warranted, as the accuracy of the reported items or scores may be questionable. The current state of the radiomics literature reveals a notable absence of self-reporting with documentation, along with inaccurate reporting practices. This critical observation may serve as a catalyst for motivating the radiomics community to adopt and utilize such tools appropriately, thereby fostering the rigor, transparency, and reproducibility of their research and moving the field forward.

• In the radiomics literature, there has been a notable absence of self-reporting with documentation.
• Even when such documents are provided, it is critical to exercise caution because the accuracy of the reported items or scores may be questionable.
• The radiomics community needs to be motivated to adopt and appropriately utilize reporting checklists and quality scoring tools.
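The following is a minimal Python sketch, not the authors' analysis code, illustrating how statistics of the kind reported above could be computed. The null proportion of 0.5 for the binomial test and the confirmed-item rates are assumptions made purely for illustration; the abstract does not specify them.

```python
# Minimal sketch of the summary statistics described in the abstract.
# Assumptions (not stated in the abstract): the binomial test compares the
# observed proportion of papers with self-reported documents (7/117) against
# an assumed null proportion of 0.5; the confirmed-item rates are hypothetical.
import numpy as np
from scipy.stats import binomtest

n_papers = 117          # sample size from the power analysis
n_self_reported = 7     # papers with at least one self-reported document

# Two-sided exact binomial test against the assumed null proportion of 0.5
result = binomtest(n_self_reported, n_papers, p=0.5)
print(f"Observed proportion: {n_self_reported / n_papers:.1%}")
print(f"Binomial test p-value: {result.pvalue:.2e}")

# Median and interquartile range of confirmed-item rates
# (illustrative values, not the study data)
confirmed_rates = np.array([75, 78, 80, 81, 82, 84, 86])  # percent, hypothetical
q1, median, q3 = np.percentile(confirmed_rates, [25, 50, 75])
print(f"Median confirmed-item rate: {median:.0f}% (IQR, {q3 - q1:.0f})")
```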
