The number of breast cancer patients increases each year, and the demand for breast cancer detection has grown accordingly. Among the many diagnostic tools in common use, the latest automated whole breast ultrasound (ABUS) technology captures the complete breast tissue structure and thereby improves breast cancer detection. However, because ABUS produces a large amount of image data, manual interpretation is time-consuming and labor-intensive, and lesions that span multiple images may be overlooked. In addition, when volume information or the three-dimensional shape of a lesion is needed for therapy, each lesion must be segmented manually, which is inefficient for diagnosis. Automatic lesion segmentation for ABUS is therefore an important issue for guiding therapy. Speckle noise in ultrasound images and the low contrast of lesion boundaries make automatic segmentation difficult.

To address these challenges, this study proposes an automated lesion segmentation algorithm whose architecture comprises four parts: (I) volume of interest selection, (II) preprocessing, (III) segmentation, and (IV) visualization. A volume of interest (VOI) is first selected automatically via a three-dimensional level set; anisotropic diffusion then suppresses the speckle noise, and intensity inhomogeneity correction removes shadowing artifacts before the adaptive distance regularized level set evolution (DRLSE) method performs segmentation. Finally, the two-dimensional segmented images are reconstructed for visualization in three-dimensional space. The ground truth is delineated by two radiologists with more than 10 years of experience in breast sonography.

Three performance assessments are carried out to evaluate the effectiveness of the proposed algorithm: a similarity measurement, a comparison with the Chan-Vese level set method, and a volume estimation on phantom cases. In the 2D validation of the first assessment, the area Dice similarity coefficients for real case group A, real case group B, and the phantoms are 0.84±0.02, 0.86±0.03, and 0.92±0.02, respectively. The overlap fraction (OF) and overlap value (OV) are 0.84±0.06 and 0.78±0.04 for group A, 0.91±0.04 and 0.82±0.05 for group B, and 0.95±0.02 and 0.92±0.03 for the phantoms. In the 3D validation, the volume Dice similarity coefficients for group A, group B, and the phantoms are 0.85±0.02, 0.89±0.04, and 0.94±0.02, respectively; the OF and OV are 0.82±0.06 and 0.79±0.04 for group A, 0.92±0.04 and 0.85±0.07 for group B, and 0.95±0.01 and 0.93±0.04 for the phantoms. The proposed algorithm is therefore highly reliable in most cases. In the second assessment, the Dice coefficients of the proposed algorithm for group A, group B, and the phantoms are 0.84±0.02, 0.86±0.03, and 0.92±0.02, compared with 0.65±0.23, 0.69±0.14, and 0.76±0.14 for the Chan-Vese level set method.
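The abstract does not give implementation details for the preprocessing stage, so the following is only a minimal illustrative sketch, assuming SimpleITK as the tooling: curvature anisotropic diffusion stands in for the speckle suppression step and N4 bias-field correction stands in for the intensity inhomogeneity correction. The file names, parameter values, and the choice of N4 are assumptions, not the authors' implementation.

```python
import SimpleITK as sitk

# Hypothetical input: one slice (or a 3D volume) from an ABUS acquisition.
image = sitk.ReadImage("abus_slice.nii.gz", sitk.sitkFloat32)

# Speckle suppression via curvature anisotropic diffusion
# (parameter values are illustrative, not taken from the paper).
diffusion = sitk.CurvatureAnisotropicDiffusionImageFilter()
diffusion.SetTimeStep(0.0625)          # stable for 2D and 3D images
diffusion.SetConductanceParameter(3.0)
diffusion.SetNumberOfIterations(10)
smoothed = diffusion.Execute(image)

# Intensity inhomogeneity correction; N4 bias-field correction is used here
# as an assumed stand-in for the correction step described in the abstract.
mask = sitk.OtsuThreshold(smoothed, 0, 1, 200)
n4 = sitk.N4BiasFieldCorrectionImageFilter()
corrected = n4.Execute(smoothed, mask)

sitk.WriteImage(corrected, "abus_slice_preprocessed.nii.gz")
```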
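The abstract reports Dice, overlap fraction (OF), and overlap value (OV) without defining them. Assuming the common conventions Dice = 2|S∩G|/(|S|+|G|), OF = |S∩G|/|G|, and OV = |S∩G|/|S∪G| (the Jaccard index), where S is the segmented region and G the ground truth, a minimal NumPy sketch of these metrics is:

```python
import numpy as np

def dice(seg, gt):
    """Dice similarity coefficient: 2|S∩G| / (|S| + |G|)."""
    seg, gt = seg.astype(bool), gt.astype(bool)
    inter = np.logical_and(seg, gt).sum()
    return 2.0 * inter / (seg.sum() + gt.sum())

def overlap_fraction(seg, gt):
    """Overlap fraction: |S∩G| / |G| (assumed definition)."""
    seg, gt = seg.astype(bool), gt.astype(bool)
    return np.logical_and(seg, gt).sum() / gt.sum()

def overlap_value(seg, gt):
    """Overlap value, i.e. Jaccard index: |S∩G| / |S∪G| (assumed definition)."""
    seg, gt = seg.astype(bool), gt.astype(bool)
    return np.logical_and(seg, gt).sum() / np.logical_or(seg, gt).sum()
```

The same functions apply unchanged to 2D masks (area metrics) and 3D masks (volume metrics), since the sums run over all array elements.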
The effect of segmentation method on Dice performance is highly significant (P<0.01), indicating that the proposed algorithm is more accurate than the Chan-Vese level set method. In the third assessment, the Spearman correlation coefficient between the segmented volumes and the corresponding ground-truth volumes is ρ=0.929 (P=0.01). In summary, the proposed method can batch-process ABUS images, segment lesions, calculate their volumes, and visualize the lesions to facilitate observation by radiologists and physicians.
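For the third assessment, the rank correlation between segmented and ground-truth phantom volumes can be computed with SciPy as sketched below; the volume values are placeholders for illustration, not the measurements reported in the study.

```python
import numpy as np
from scipy.stats import spearmanr

# Placeholder volumes (e.g., in mm^3): segmented vs. ground-truth phantom lesions.
segmented_volumes    = np.array([310.0, 540.0, 890.0, 1250.0, 1980.0])
ground_truth_volumes = np.array([300.0, 560.0, 870.0, 1300.0, 2000.0])

# Spearman's rank correlation between the two volume series.
rho, p_value = spearmanr(segmented_volumes, ground_truth_volumes)
print(f"Spearman rho = {rho:.3f}, P = {p_value:.3f}")
```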