Abstract

Purpose
Artificial intelligence (AI) is rapidly reshaping how radiology is practiced. Its susceptibility to biases, however, is a primary concern as more AI algorithms become available for widespread use. So far, there has been limited evaluation of how sociodemographic variables are reported in radiology AI research. This study aims to evaluate the presence and extent of sociodemographic reporting in human subjects radiology AI original research.

Methods
All human subjects original radiology AI articles published from January to December 2020 in the top six US radiology journals, as determined by impact factor, were reviewed. Reporting of any sociodemographic variables (age, gender, and race or ethnicity) as well as any sociodemographic-based results were extracted.

Results
Of the 160 included articles, 54% reported at least one sociodemographic variable: 53% reported age, 47% gender, and 4% race or ethnicity. Six percent reported any sociodemographic-based results. There was significant variation in reporting of at least one sociodemographic variable by journal, ranging from 33% to 100%.

Conclusions
Reporting of sociodemographic variables in human subjects original radiology AI research remains poor, putting the results and subsequent algorithms at increased risk of biases.

