Abstract

Accurate identification of KRAS mutation status from medical images is critical for clinicians selecting treatment options for patients with rectal cancer. Deep learning methods have recently been applied successfully to medical diagnosis and treatment problems, but substantial challenges remain in computer-aided diagnosis (CAD) due to the lack of large training datasets. In this paper, we propose a multi-branch cross attention model (MBCAM) to separate KRAS mutation cases from wild-type cases using limited T2-weighted MRI data. Our model is built on multiple branches generated from our existing MRI data, which allows it to take full advantage of the information contained in a small dataset. The cross attention block (CA block) is proposed to fuse the formerly independent branches so that the model learns as many common features as possible, mitigating overfitting on the limited dataset. The inter-branch loss is proposed to constrain the learning range of the model, ensuring that it learns more general features from the multi-branch data. We tested our method on the collected dataset and compared it to four previous works and five popular deep learning models using transfer learning. Our results show that the MBCAM achieved an accuracy of 88.92% for the prediction of KRAS mutations, with an AUC of 95.75%. These results are a significant improvement over the existing methods (p < 0.05).
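The abstract does not give the internal details of the CA block or the inter-branch loss, but the general idea it describes (one branch attending to another's features, plus a consistency penalty between branches) can be sketched as follows. This is a minimal, hypothetical NumPy illustration of scaled dot-product cross attention and a simple mean-squared consistency loss; the function names, shapes, and loss form are assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q_feats, kv_feats):
    """Scaled dot-product cross attention: queries come from one branch,
    keys/values from another, so each branch attends to the other's features.
    q_feats: (Nq, d), kv_feats: (Nk, d) -> fused output (Nq, d)."""
    d = q_feats.shape[-1]
    scores = q_feats @ kv_feats.T / np.sqrt(d)   # (Nq, Nk) similarity
    weights = softmax(scores, axis=-1)           # each row sums to 1
    return weights @ kv_feats                    # weighted mix of the other branch

def inter_branch_loss(feat_a, feat_b):
    """Hypothetical consistency penalty: mean squared distance between branch
    features, pushing branches toward shared, more general representations."""
    return float(np.mean((feat_a - feat_b) ** 2))

# Toy usage: two branches, 4 feature vectors of dimension 8 each.
rng = np.random.default_rng(0)
branch_a = rng.standard_normal((4, 8))
branch_b = rng.standard_normal((4, 8))
fused = cross_attention(branch_a, branch_b)      # branch A attends to branch B
loss = inter_branch_loss(branch_a, branch_b)     # consistency term for training
```

In a real model these features would be CNN activations and the loss would be added to the classification objective; the sketch only shows the fusion and constraint mechanics the abstract names.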
