Background
Breast ultrasound (US) is useful for dense breasts, and the introduction of artificial intelligence (AI)-assisted diagnosis of breast US images should be considered. However, implementing AI-based technologies in clinical practice is problematic because of the cost of integrating such approaches into hospital information systems (HISs) and the security risk of connecting an HIS to the Internet to access AI services. To solve these problems, we developed a system that applies AI to the analysis of breast US images captured with a smartphone.

Methods
Training data were prepared using 115 images of benign lesions and 201 images of malignant lesions acquired at the Division of Breast Surgery, Gifu University Hospital. YOLOv3, an object detection model, was used to detect lesions in US images. A graphical user interface (GUI) was developed for running predictions on an AI server. A smartphone application was also developed that captures US images displayed on the HIS monitor with the phone's camera and displays the prediction results received from the AI server. The sensitivity and specificity of the predictions made on the AI server and via the smartphone were calculated using 60 images set aside from the training data.

Results
The established AI showed 100% sensitivity and 75% specificity for malignant lesions and took 0.2 s per prediction on the AI server. Prediction via the smartphone required 2 s per image and showed 100% sensitivity and 97.5% specificity for malignant lesions.

Conclusions
Good-quality predictions were obtained using the AI server. Moreover, the quality of the predictions made via the smartphone was slightly better than that of the AI server, and the smartphone-based system can be introduced into HISs safely and inexpensively.
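The reported metrics follow the standard definitions of sensitivity (true positives among malignant lesions) and specificity (true negatives among benign lesions). A minimal sketch of the computation, using illustrative counts only (the abstract does not state the benign/malignant split of the 60 test images; a 20 malignant / 40 benign split is assumed here because it would reproduce the reported smartphone figures):

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical confusion counts: 20 malignant images all detected (TP=20, FN=0),
# 39 of 40 benign images correctly classified (TN=39, FP=1).
sens, spec = sensitivity_specificity(tp=20, fn=0, tn=39, fp=1)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
# -> sensitivity=100.0%, specificity=97.5%
```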