Abstract
Around 60-80% of radiological errors are attributed to overlooked abnormalities, the rate of which increases toward the end of work shifts. In this study, we ran an experiment to investigate whether artificial intelligence (AI) can assist in detecting radiologists' gaze patterns that correlate with fatigue. A retrospective database of lung X-ray images with reference diagnoses was used. The X-ray images were acquired from 400 subjects (mean age 49 ± 17 years; 61% men). Four practicing radiologists read these images while their eye movements were recorded, and they completed a series of concentration tests during prearranged breaks in the experiment. A U-Net neural network was adapted to annotate lung anatomy on the X-rays and to calculate coverage and information gain features from the radiologists' eye movements over the lung fields. The lung coverage, information gain, and eye tracker-based features were compared against a cumulative work done (CWD) label for each radiologist. The gaze-traveled distance, X-ray coverage, and lung coverage deteriorated statistically significantly (p < 0.01) with CWD for three out of four radiologists. The reading time and the information gain over the lungs deteriorated statistically significantly for all four radiologists. We discovered a novel AI-based metric blending reading time, speed, and organ coverage that can be used to predict fatigue-related changes in image reading patterns.
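To make the gaze-derived features concrete, the sketch below shows one plausible way to compute lung coverage and an entropy-based proxy for information gain from fixation coordinates and a binary lung mask. The abstract does not specify the authors' exact formulas, so the foveal radius, grid size, and all function names here are illustrative assumptions; the lung mask is assumed to come from a U-Net-style segmentation.

```python
import numpy as np


def lung_coverage(gaze_xy, lung_mask, radius=25):
    """Fraction of lung-field pixels covered by foveal discs around fixations.

    gaze_xy   : (N, 2) array of fixation coordinates (x, y) in image pixels.
    lung_mask : (H, W) boolean array, True inside the segmented lung fields.
    radius    : assumed foveal radius in pixels (illustrative value).
    """
    h, w = lung_mask.shape
    yy, xx = np.mgrid[0:h, 0:w]
    covered = np.zeros((h, w), dtype=bool)
    for x, y in gaze_xy:
        covered |= (xx - x) ** 2 + (yy - y) ** 2 <= radius ** 2
    lung_pixels = lung_mask.sum()
    return float((covered & lung_mask).sum() / lung_pixels) if lung_pixels else 0.0


def information_gain(gaze_xy, lung_mask, bins=16):
    """Entropy-based proxy for information gain over the lung fields:
    how evenly fixations are spread across a coarse grid of lung regions.
    This is an assumed stand-in, not the paper's definition.
    """
    h, w = lung_mask.shape
    # Keep only fixations that land inside the lung mask.
    inside = [(x, y) for x, y in gaze_xy
              if 0 <= int(y) < h and 0 <= int(x) < w and lung_mask[int(y), int(x)]]
    if not inside:
        return 0.0
    xs, ys = zip(*inside)
    hist, _, _ = np.histogram2d(ys, xs, bins=bins, range=[[0, h], [0, w]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())


if __name__ == "__main__":
    # Dummy data standing in for a U-Net lung mask and an eye-tracker trace.
    mask = np.zeros((512, 512), dtype=bool)
    mask[100:400, 80:240] = True    # "left" lung field
    mask[100:400, 280:440] = True   # "right" lung field
    rng = np.random.default_rng(0)
    fixations = rng.uniform(0, 512, size=(60, 2))
    print("lung coverage:", lung_coverage(fixations, mask))
    print("information gain (entropy proxy):", information_gain(fixations, mask))
```

Per-image values of this kind could then be tracked against the CWD label to test whether coverage and information gain decline as a reading session progresses.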