Abstract

Mammography and ultrasound are widely used for breast cancer diagnosis. In 2003, the American College of Radiology published the ultrasound lexicon of the Breast Imaging Reporting and Data System (BI-RADS) to standardize the terminology of breast ultrasonic diagnosis. Although BI-RADS categorization of ultrasound findings is used worldwide, maintaining consistent examination quality remains an urgent task in breast ultrasound diagnosis. Despite these quality-control problems, the importance of ultrasonic examination in breast cancer diagnosis is increasing. The "Comparative Exam for Verifying the Effectiveness of Ultrasound in Breast Cancer Screening (Japan Strategic Anti-cancer Randomized Trial: J-START)", a randomized controlled trial, showed that breast screening combining mammography and ultrasound detected early breast cancer significantly better than mammography alone among women in their 40s. Despite some debate about the increase in false-positive cases, ultrasound screening is becoming more important and more widely used in population-based (countermeasure-type) screening programs in Japan. Accordingly, educating the doctors and technicians responsible for ultrasonographic screening and improving examination accuracy are urgent matters. Because the accuracy of breast ultrasound examination depends on the equipment, the environment, and the technicians and doctors involved, the quality-control problem must be solved. Therefore, we are collaborating with a company that specializes in Artificial Intelligence (AI) to develop a "quality-controlled" breast ultrasonographic diagnosis system using AI technology. The system can detect the presence of a lesion and make a diagnosis in real time during the ultrasonic examination. It uses a convolutional neural network (CNN), a technique frequently used in AI-based image analysis. We selected ultrasonographic images for machine learning from patients who underwent ultrasonographic screening between January 2016 and December 2018 at Keio University Hospital, Japan. All patients diagnosed as malignant underwent biopsy and were diagnosed histologically. Patients diagnosed as benign either underwent biopsy or were followed up for at least half a year. Trained physicians annotated each selected image. Annotations were added to 901 images, of which 630 (138 images of normal mammary gland, 242 of benign tumors, 250 of malignant tumors) were used as training data. We tested the system on the remaining 271 images (70 images of normal mammary gland, 97 of benign tumors, 104 of malignant tumors) as test data. The sensitivity of mass (benign or malignant) detection against normal breast tissue was 91.8%, and the specificity was 91.4%. The sensitivity for detecting malignant lesions among masses was 86.2%, and the specificity was 88.4%. In conclusion, we succeeded in developing a real-time, quality-controlled breast ultrasonographic diagnosis system using artificial intelligence. We plan to improve both sensitivity and specificity by adding more training data, aiming for clinical application in the near future.

Citation Format: Masayuki Kikuchi, Tetsu Hayashida, Rurina Watanuki, Ayako Nakashoji, Yuko Kawai, Aiko Nagayama, Tomoko Seki, Maiko Takahashi, Yuko Kitagawa. Diagnostic system of breast ultrasound images using Convolutional Neural Network [abstract]. In: Proceedings of the 2019 San Antonio Breast Cancer Symposium; 2019 Dec 10-14; San Antonio, TX. Philadelphia (PA): AACR; Cancer Res 2020;80(4 Suppl):Abstract nr P1-02-09.
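The abstract does not disclose the network architecture, framework, or evaluation code, so the following is only a minimal sketch of the general approach: a small PyTorch CNN with three output classes (normal mammary gland, benign tumor, malignant tumor) and a simple helper for the two sensitivity/specificity tasks described above. The model name, layer sizes, 224x224 grayscale input, and example labels are illustrative assumptions, not the authors' implementation.

# Minimal sketch (assumed, not the authors' actual model): a three-class CNN
# for breast ultrasound frames plus sensitivity/specificity helpers.
import torch
import torch.nn as nn


class BreastUSNet(nn.Module):
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),   # single-channel (grayscale) input
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                               # 224 -> 112
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                               # 112 -> 56
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),                       # global average pooling
        )
        self.classifier = nn.Linear(128, num_classes)      # 0 = normal, 1 = benign, 2 = malignant

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))


def sensitivity_specificity(pred_positive, true_positive):
    """Sensitivity and specificity from parallel lists of boolean predictions/labels."""
    tp = sum(p and t for p, t in zip(pred_positive, true_positive))
    tn = sum((not p) and (not t) for p, t in zip(pred_positive, true_positive))
    fp = sum(p and (not t) for p, t in zip(pred_positive, true_positive))
    fn = sum((not p) and t for p, t in zip(pred_positive, true_positive))
    return tp / (tp + fn), tn / (tn + fp)


if __name__ == "__main__":
    model = BreastUSNet()
    dummy = torch.randn(4, 1, 224, 224)       # batch of 4 grayscale frames (random, for illustration)
    pred = model(dummy).argmax(dim=1)          # predicted class per frame

    # Task 1 from the abstract: mass (benign or malignant) detection vs. normal tissue.
    true_mass = [True, True, False, False]     # hypothetical ground-truth labels
    pred_mass = [bool(c != 0) for c in pred]   # class 0 = normal, so any nonzero class counts as a mass
    sens, spec = sensitivity_specificity(pred_mass, true_mass)
    print(f"mass detection: sensitivity={sens:.3f}, specificity={spec:.3f}")

In this sketch the second task reported in the abstract (malignant vs. benign among detected masses) would be evaluated the same way, restricting the lists to frames labeled as masses and treating class 2 as the positive class.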