Abstract

Objective
Neoadjuvant chemotherapy (NAC) is a key element of treatment for locally advanced breast cancer (LABC). Predicting the response to NAC before treatment initiation could help optimize therapy and ensure that effective treatments are administered. The objective of this work was to develop a model that predicts tumor response to NAC in LABC using deep learning networks and computed tomography (CT).

Materials and methods
Several deep learning approaches were investigated, including a ViT transformer and the VGG16, VGG19, ResNet-50, ResNet-101, ResNet-152, InceptionV3 and Xception transfer learning networks. These networks were applied to CT images to assess the response to NAC. Performance was evaluated using balanced accuracy, accuracy, sensitivity and specificity. The ViT transformer was applied to exploit the attention mechanism, which increases the weight of the most informative image regions and thereby improves discrimination between classes.

Results
Among the 117 LABC patients studied, 82 (70%) had a clinical-pathological response and 35 (30%) had no response to NAC. The ViT transformer achieved the best performance range (accuracy = 71 ± 3% to 77 ± 4%, specificity = 86 ± 6% to 76 ± 3%, sensitivity = 56 ± 4% to 52 ± 4%, and balanced accuracy = 69 ± 3% to 69 ± 3%), depending on the train-test split ratio. The Xception network obtained the second-best results (accuracy = 72 ± 4% to 65 ± 4%, specificity = 81 ± 6% to 73 ± 3%, sensitivity = 55 ± 4% to 52 ± 5%, and balanced accuracy = 66 ± 5% to 60 ± 4%). The worst results were obtained with the VGG16 transfer learning network.

Conclusion
Deep learning networks in conjunction with CT imaging can predict tumor response to NAC for patients with LABC prior to treatment initiation. The ViT transformer achieved the best performance, demonstrating the importance of the attention mechanism.
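
The abstract does not specify implementation details, so the following is only a minimal sketch of the kind of transfer-learning pipeline and metric computation it describes, using the Xception network as one example. The input size (224 × 224 × 3), binary labels (1 = response, 0 = no response), frozen backbone, classification head, and decision threshold are illustrative assumptions, not details reported by the authors.

```python
# Hedged sketch of a transfer-learning classifier for NAC response from CT images.
# Assumptions (not stated in the abstract): 224x224x3 inputs, binary labels,
# a frozen ImageNet-pretrained backbone, and a simple pooling + dense head.
import tensorflow as tf
from sklearn.metrics import confusion_matrix, balanced_accuracy_score


def build_xception_classifier(input_shape=(224, 224, 3)):
    """Xception backbone pre-trained on ImageNet with a binary response head."""
    base = tf.keras.applications.Xception(
        include_top=False, weights="imagenet", input_shape=input_shape
    )
    base.trainable = False  # freeze the backbone for the transfer-learning stage

    inputs = tf.keras.Input(shape=input_shape)
    x = tf.keras.applications.xception.preprocess_input(inputs)
    x = base(x, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    x = tf.keras.layers.Dropout(0.3)(x)
    outputs = tf.keras.layers.Dense(1, activation="sigmoid")(x)

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model


def evaluate_response_prediction(model, x_test, y_test, threshold=0.5):
    """Compute the metrics reported in the abstract on a held-out test split."""
    y_prob = model.predict(x_test).ravel()
    y_pred = (y_prob >= threshold).astype(int)

    tn, fp, fn, tp = confusion_matrix(y_test, y_pred, labels=[0, 1]).ravel()
    sensitivity = tp / (tp + fn)   # true-positive rate (responders)
    specificity = tn / (tn + fp)   # true-negative rate (non-responders)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    balanced_acc = balanced_accuracy_score(y_test, y_pred)  # mean of the two rates

    return {
        "accuracy": accuracy,
        "sensitivity": sensitivity,
        "specificity": specificity,
        "balanced_accuracy": balanced_acc,
    }
```

The same evaluation function would apply unchanged to any of the other backbones (VGG16, ResNet-50, InceptionV3, or a ViT), since only the model construction step differs between architectures.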
