Abstract
Segmentation of the breast ultrasound (BUS) image is an important step in the subsequent assessment and diagnosis of breast lesions. Recently, deep-learning-based methods have achieved satisfactory performance in many computer vision tasks, especially in medical image segmentation. Nevertheless, these methods typically require large amounts of pixel-wise labeled data, which are expensive to obtain in medical practice. In this study, we propose a new segmentation method based on dense prediction and local fusion of superpixels for breast anatomy with scarce labeled data. First, the proposed method generates superpixels from the BUS image enhanced by histogram equalization, a bilateral filter, and a pyramid mean shift filter. Second, using a convolutional neural network (CNN) and a distance metric learning-based classifier, the superpixels are projected onto an embedding space and then classified by computing the distance between the superpixels' embeddings and the centers of the categories. Because each BUS image yields a large number of superpixels, many training samples can be generated per image, which mitigates the scarcity of labeled data. To correct misclassified superpixels, K-nearest neighbor (KNN) is used to reclassify the superpixels within each local region based on the spatial relationships among them. Fivefold cross-validation was performed, and the experimental results show that our method outperforms several widely used deep-learning methods when only a small amount of labeled data is available (48 BUS images for training and 12 BUS images for testing).
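The two classification stages described above can be sketched in a few lines. This is a minimal NumPy illustration, not the authors' implementation: the embedding dimensionality, the choice of Euclidean distance, the use of superpixel centroid coordinates as the "spatial relationship", and the value of k are all assumptions made for the sketch.

```python
import numpy as np

def classify_by_centers(embeddings, centers):
    """Nearest-centroid rule: assign each superpixel embedding to the
    class whose center is closest in the learned embedding space.
    embeddings: (n_superpixels, d); centers: (n_classes, d)."""
    # Pairwise Euclidean distances, shape (n_superpixels, n_classes).
    d = np.linalg.norm(embeddings[:, None, :] - centers[None, :, :], axis=2)
    return d.argmin(axis=1)

def knn_refine(labels, positions, k=5):
    """Reclassify each superpixel by majority vote of its k nearest
    spatial neighbours (here: superpixel centroid coordinates),
    smoothing isolated misclassifications within a local region."""
    labels = np.asarray(labels)
    refined = labels.copy()
    for i, p in enumerate(positions):
        dist = np.linalg.norm(positions - p, axis=1)
        nn = np.argsort(dist)[1:k + 1]  # k nearest, excluding the superpixel itself
        refined[i] = np.bincount(labels[nn]).argmax()
    return refined
```

For example, a superpixel initially labeled as lesion but surrounded by superpixels labeled as normal tissue is flipped to the majority label of its neighbours, which is the local-fusion step the abstract refers to.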
Published in: IEEE Transactions on Instrumentation and Measurement