Abstract

Purpose
Many low- and middle-income countries (LMICs) suffer from chronic shortages of resources that inhibit the implementation of effective breast cancer screening programs. Advanced breast cancer rates in the U.S. Affiliated Pacific Islands substantially exceed those of the United States. We propose the use of portable breast ultrasound coupled with artificial intelligence (AI) algorithms to assist non-radiologist field personnel with real-time lesion detection, classification, and biopsy, as well as determination of breast density for risk assessment. In this study, we examine the ability of an AI algorithm to detect and describe breast cancer lesions in clinically acquired breast ultrasound images from 40,000 women participating in a Hawaii screening program.

Materials and Methods
The Hawaii and Pacific Islands Mammography Registry (HIPIMR) collects breast health questionnaires and breast imaging (mammography, ultrasound, and MRI) from participating centers in Hawaii and the Pacific and links this information to the Hawaii Tumor Registry for cancer findings. From the women with either screening or diagnostic B-mode breast ultrasound exams, we selected three age-matched negative cases (no cancer) for every positive case, excluding Doppler and elastography images. The blinded images were read by the study radiologist, who delineated all lesions and described them in terms of the BI-RADS lexicon. These images were split by woman into training (70%), validation and hyperparameter selection (20%), and testing (20%) subsets. An AI model was fine-tuned for lesion detection and BI-RADS category classification from the Detectron2 framework [1], pre-trained on the COCO instance segmentation dataset [2]. Model performance was evaluated in terms of precision and sensitivity, as well as area under the receiver operating characteristic curve (AUROC). Detections were considered positive if they overlapped a ground-truth lesion delineation by at least 50% (Intersection over Union ≥ 0.5), and a maximum of 4 detections were generated for each image. Timing experiments were run on unbatched images on a GPU-enabled machine (NVIDIA Tesla V100).

Results
Over the 10-year observation period, we identified 5,214 women with ultrasound images meeting our criteria. Of these, 111 were diagnosed with malignant breast cancer and 333 were selected as non-cases, for a total of 444 women. These 444 women had a total of 4,623 ultrasound images, with 2,028 benign and 1,431 malignant lesions identified by the study radiologist. For cancerous lesions, the AI algorithm achieved 8% precision at a sensitivity of 90% on the testing set; for benign lesions, a sensitivity of 90% corresponded to 5% precision on the testing set. The AUROC for bounding-box detections was 0.90 for cancerous lesions and 0.87 for benign lesions. The model made predictions at approximately 25 frames per second (38.7 milliseconds per image).

Conclusion
Detection, segmentation, and cancer classification of breast lesions are possible in clinically acquired ultrasound images using AI. Based on our timing experiments, the model is capable of detecting and classifying lesions in real time during ultrasound capture. Model performance is expected to improve as more data become available for training. Future work would involve further fine-tuning of the model on portable breast ultrasound images and increasing model evaluation speed in order to assess utility in low-resource populations.
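The fine-tuning setup described in the Methods can be outlined with the Detectron2 configuration API. The sketch below is illustrative only: the baseline model (a Mask R-CNN R50-FPN pre-trained on COCO instance segmentation), the dataset names, and the solver settings are assumptions and are not the study's actual configuration.

```python
# Minimal sketch of fine-tuning a COCO-pretrained Detectron2 model on breast
# ultrasound images. Dataset names and hyperparameter values are hypothetical.
from detectron2 import model_zoo
from detectron2.config import get_cfg
from detectron2.engine import DefaultTrainer

cfg = get_cfg()
# Start from a Mask R-CNN baseline pre-trained on COCO instance segmentation.
cfg.merge_from_file(
    model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
)
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url(
    "COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"
)
# Hypothetical dataset names; the ultrasound data would be registered with
# detectron2.data.DatasetCatalog beforehand and split by woman, not by image.
cfg.DATASETS.TRAIN = ("breast_us_train",)
cfg.DATASETS.TEST = ("breast_us_val",)
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 2      # benign and malignant lesion classes
cfg.TEST.DETECTIONS_PER_IMAGE = 4        # cap detections per image, as in the evaluation
cfg.SOLVER.IMS_PER_BATCH = 8             # illustrative solver settings only
cfg.SOLVER.BASE_LR = 0.00025
cfg.SOLVER.MAX_ITER = 20000

trainer = DefaultTrainer(cfg)
trainer.resume_or_load(resume=False)
trainer.train()
```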
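The matching rule used for evaluation, counting a detection as a true positive when it overlaps a ground-truth lesion delineation with Intersection over Union of at least 0.5, can be sketched as follows. Box format and function names are illustrative, not the study's evaluation code.

```python
# Minimal sketch of IoU-based detection matching at a 0.5 threshold.
def iou(box_a, box_b):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def match_detections(pred_boxes, gt_boxes, iou_threshold=0.5):
    """Greedily match each prediction (sorted by confidence) to at most one
    unmatched ground-truth lesion; return a True/False flag per prediction."""
    matched_gt = set()
    is_true_positive = []
    for pred in pred_boxes:
        best_iou, best_idx = 0.0, None
        for j, gt in enumerate(gt_boxes):
            if j in matched_gt:
                continue
            overlap = iou(pred, gt)
            if overlap > best_iou:
                best_iou, best_idx = overlap, j
        if best_idx is not None and best_iou >= iou_threshold:
            matched_gt.add(best_idx)
            is_true_positive.append(True)
        else:
            is_true_positive.append(False)
    return is_true_positive
```

The per-detection true/false-positive flags produced this way, together with detection confidences, are what precision at a fixed sensitivity and AUROC summaries are computed from.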
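A minimal sketch of the timing experiment follows: per-image latency of unbatched inference on a GPU, converted to frames per second. The predictor construction and the image list are assumptions (following the Detectron2 DefaultPredictor pattern), not the study's code.

```python
# Minimal sketch of unbatched per-image inference timing.
import time
import cv2
from detectron2.engine import DefaultPredictor

predictor = DefaultPredictor(cfg)   # cfg as configured above, pointing at trained weights
images = [cv2.imread(p) for p in test_image_paths]   # hypothetical list of image paths

latencies = []
for img in images:
    start = time.perf_counter()
    _ = predictor(img)              # one unbatched forward pass per image
    latencies.append(time.perf_counter() - start)

ms_per_image = 1000.0 * sum(latencies) / len(latencies)
frames_per_second = 1000.0 / ms_per_image
print(f"{ms_per_image:.1f} ms/image, ~{frames_per_second:.1f} frames/second")
```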
References
[1] Wu Y, Kirillov A, Massa F, Lo W-Y, Girshick R. Detectron2. https://github.com/facebookresearch/detectron2.
[2] Lin T-Y, Maire M, Belongie S, et al. Microsoft COCO: Common Objects in Context. In: Computer Vision – ECCV 2014. Springer International Publishing; 2014:740-755.

Citation Format: Arianna Bunnell, Dustin Valdez, Thomas Wolfgruber, Aleen Altamirano, Brenda Hernandez, Peter Sadowski, John Shepherd. Artificial Intelligence Detects, Classifies, and Describes Lesions in Clinical Breast Ultrasound Images [abstract]. In: Proceedings of the 2022 San Antonio Breast Cancer Symposium; 2022 Dec 6-10; San Antonio, TX. Philadelphia (PA): AACR; Cancer Res 2023;83(5 Suppl):Abstract nr P3-04-05.