Abstract

Purpose

Artificial intelligence (AI) is rapidly reshaping how radiology is practiced. Its susceptibility to biases, however, is a primary concern as more AI algorithms become available for widespread use. So far, there has been limited evaluation of how sociodemographic variables are reported in radiology AI research. This study aims to evaluate the presence and extent of sociodemographic reporting in human subjects radiology AI original research.

Methods

All human subjects original radiology AI articles published from January to December 2020 in the top six US radiology journals, as determined by impact factor, were reviewed. Reporting of any sociodemographic variables (age, gender, and race or ethnicity), as well as any sociodemographic-based results, was extracted.

Results

Of the 160 included articles, 54% reported at least one sociodemographic variable: 53% reported age, 47% gender, and 4% race or ethnicity. Six percent reported any sociodemographic-based results. Reporting of at least one sociodemographic variable varied significantly by journal, ranging from 33% to 100%.

Conclusions

Reporting of sociodemographic variables in human subjects original radiology AI research remains poor, putting the results and subsequent algorithms at increased risk of bias.

