Abstract

The interpretability of artificial intelligence (AI)-based medical diagnostic systems is crucial for making their diagnoses convincing. Deep learning has been extensively investigated and applied to computer-assisted medical diagnosis in recent decades owing to its strong performance and objective predictions. However, a wide semantic gap separates clinicians from unexplainable deep models. Here we design a brain-inspired inference framework that proceeds from medical images to explainable features and then to the final diagnostic conclusion. A fast-thinking module recognizes medical features in ultrasound (US) images, and a slow-thinking module infers diagnostic results from those features by constructing a knowledge graph of medical features with tensor decomposition. The whole model reasons through both intuition and deliberation, as a human does, and reports the recognized image features alongside the inferred diagnosis, which greatly improves interpretability. We evaluated the framework on thyroid cancer diagnosis from US images, using the American College of Radiology (ACR) Thyroid Imaging Reporting and Data System (TI-RADS) characteristics as the medical features describing thyroid nodules. Our brain-inspired medical inference framework outperforms commonly used deep learning algorithms, achieving an AUC of 0.963 (95% confidence interval (CI) = 0.923–1.000) for thyroid US image diagnosis. The results indicate that the framework improves diagnostic objectivity and interpretability while outperforming standard deep models, and it could thereby improve the efficiency of diagnosis.
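
The following is a minimal, illustrative sketch (not the authors' implementation) of the two-stage inference described above. It assumes the fast-thinking module is some image model that outputs probabilities for TI-RADS features, and it stands in for the slow-thinking module with a DistMult-style factorization of a (feature, relation, diagnosis) knowledge-graph tensor; the feature list, embedding sizes, and function names are assumptions for illustration only.

```python
# Hedged sketch of the fast-thinking -> slow-thinking pipeline, assuming:
#  - "fast thinking": an image model producing probabilities for TI-RADS features;
#  - "slow thinking": a DistMult-style tensor factorization over a
#    (feature, "suggests", diagnosis) knowledge graph (illustrative choice).
import numpy as np

rng = np.random.default_rng(0)

FEATURES = ["solid", "hypoechoic", "taller-than-wide",
            "irregular-margin", "microcalcifications"]
DIAGNOSES = ["benign", "malignant"]
DIM = 8  # embedding dimension (illustrative)

# Hypothetical learned embeddings for features, diagnoses, and one relation;
# in practice these would be fit to a labeled knowledge graph.
E_feat = rng.normal(size=(len(FEATURES), DIM))
E_diag = rng.normal(size=(len(DIAGNOSES), DIM))
R_suggests = rng.normal(size=DIM)  # diagonal relation factor (DistMult)

def slow_thinking(feature_probs: np.ndarray) -> np.ndarray:
    """Aggregate per-feature triple scores into diagnosis probabilities."""
    # score(f, suggests, d) = <e_f, r, e_d> under the DistMult factorization
    triple_scores = (E_feat * R_suggests) @ E_diag.T      # (n_feat, n_diag)
    logits = feature_probs @ triple_scores                 # weight by evidence
    return np.exp(logits) / np.exp(logits).sum()

# Stand-in for the fast-thinking module's output on one ultrasound image.
feature_probs = np.array([0.9, 0.8, 0.2, 0.7, 0.6])
print(dict(zip(DIAGNOSES, slow_thinking(feature_probs).round(3))))
```

Because the diagnosis is computed from explicit feature probabilities rather than raw pixels, the intermediate TI-RADS features can be reported to the clinician together with the prediction, which is the source of the interpretability claimed in the abstract.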
