Abstract
Background
The scoring of Ki-67 is highly relevant to the diagnosis, classification, prognosis, and treatment of invasive ductal carcinoma (IDC) of the breast. The traditional scoring method, Ki-67 staining followed by manual counting, is time-consuming and subject to inter- and intra-observer variability, which may limit its clinical value. Although a growing number of algorithms and platforms have been developed to improve the accuracy of Ki-67 image assessment, most lack accurate registration between immunohistochemical (IHC) images and their matched hematoxylin-eosin (HE) images, or do not accurately label each Ki-67-positive and -negative cell on whole tissue sections (WTS). In view of this, we introduce an accurate image registration method and automatic Ki-67 identification and counting software based on WTS using deep learning.
Methods
We annotated 1017 breast IDC whole slide images (WSI) and established a research workflow based on (i) identification of the IDC area, (ii) registration of HE and IHC slides from the same anatomical region, and (iii) counting of Ki-67-positive staining.
Results
The accuracy, sensitivity, and specificity of identifying breast IDC regions were 89.44, 85.05, and 95.23%, respectively, and the contiguous HE and Ki-67-stained slides were precisely registered. Using 10 Ki-67 slides with every cell counted and labelled as the reference standard on WTS, the accuracy of the automatically calculated Ki-67 positive rate in the identified IDC regions was 90.2%. In a human-machine competition of Ki-67 scoring, the software processed one slide in an average of 2.3 min on a single GPU with an accuracy of 99.4%, exceeding the results of more than 90% of the participating doctors.
Conclusions
Our study demonstrates the enormous potential of automated quantitative analysis of Ki-67 staining with HE image recognition and registration based on WTS; automated Ki-67 scoring can thus successfully address issues of consistency, reproducibility, and accuracy. We will provide these labelled images as an open, free platform for researchers to assess the performance of computer algorithms for automated Ki-67 scoring on IHC-stained slides.
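The abstract does not detail the registration algorithm itself. As a hedged illustration of one common building block for aligning contiguous HE and IHC slide patches, translational alignment can be estimated by phase correlation; the function and array names below are ours, not the authors', and a real whole-slide pipeline would also handle rotation, scale, and local deformation:

```python
import numpy as np

def estimate_shift(fixed: np.ndarray, moving: np.ndarray) -> tuple[int, int]:
    """Estimate the integer (dy, dx) translation that aligns `moving` to `fixed`.

    Uses phase correlation: the peak of the inverse FFT of the normalized
    cross-power spectrum marks the relative translation between the images.
    """
    F = np.fft.fft2(fixed)
    M = np.fft.fft2(moving)
    cross = F * np.conj(M)
    cross /= np.abs(cross) + 1e-12  # normalize to keep only phase information
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap the peak coordinates into a signed shift range.
    h, w = fixed.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

Applying `np.roll(moving, shift=estimate_shift(fixed, moving), axis=(0, 1))` then brings the moving patch back into alignment with the fixed one for purely translational offsets.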
Highlights
Breast invasive ductal carcinoma (IDC) is the most common malignant tumor in women worldwide, with a trend toward younger age at diagnosis [1, 2]
Our study demonstrates the enormous potential of automated quantitative analysis of Ki-67 staining and HE image recognition and registration based on whole tissue sections (WTS), and the automated scoring of Ki-67 can successfully address issues of consistency, reproducibility, and accuracy
We will provide these labelled images as an open, free platform for researchers to assess the performance of computer algorithms for automated Ki-67 scoring on IHC-stained slides
Summary
Breast invasive ductal carcinoma (IDC) is the most common malignant tumor in women worldwide, with a trend toward younger age at diagnosis [1, 2]. In 2018, there were more than 266,000 new cases of breast cancer in women in the United States, accounting for 30% of all malignant tumors in women and far exceeding lung cancer, the second most common (13%) [3]. In both developed and developing countries, the disease ranks third in mortality among females [2, 3]. The scoring of Ki-67 is highly relevant to the diagnosis, classification, prognosis, and treatment of breast IDC. We introduce an accurate image registration method and automatic Ki-67 identification and counting software based on whole tissue sections (WTS) using deep learning
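The Ki-67 positive rate that the counting step produces is a simple proportion of positive tumor cells among all tumor cells in the identified IDC region. A minimal sketch of that final calculation (an illustrative helper of ours, not the authors' software; the example counts are invented):

```python
def ki67_positive_rate(positive_cells: int, negative_cells: int) -> float:
    """Ki-67 positive rate: percentage of tumor cells staining positive."""
    total = positive_cells + negative_cells
    if total == 0:
        raise ValueError("no tumor cells counted")
    return 100.0 * positive_cells / total

# Hypothetical counts: 412 positive and 1648 negative tumor cells
rate = ki67_positive_rate(412, 1648)
print(f"Ki-67 index: {rate:.1f}%")  # Ki-67 index: 20.0%
```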