Abstract
Background and Objective
Automated breast ultrasound (ABUS) imaging has been widely adopted in clinical diagnosis. Accurate lesion segmentation in ABUS images is essential for computer-aided diagnosis (CAD) systems. Although deep learning-based approaches are widely employed in medical image analysis, the large variety of lesions and imaging interference make ABUS lesion segmentation challenging.

Methods
In this paper, we propose a novel deepest semantically guided multi-scale feature fusion network (DSGMFFN) for lesion segmentation in 2D ABUS slices. To cope with the large variety of lesions, a deepest semantically guided decoder (DSGNet) and a multi-scale feature fusion model (MFFM) are designed, in which the deepest semantics fully guides decoding and feature fusion: the deepest features receive the highest weight in the fusion process and participate in every decoding stage. To address imaging interference, a novel mixed attention mechanism is developed that integrates spatial self-attention and channel self-attention, capturing the correlations among pixels and among channels to highlight the lesion region.

Results
The proposed DSGMFFN is evaluated on 3742 slices from 170 ABUS volumes. Experimental results indicate that DSGMFFN achieves a Dice similarity coefficient (DSC) of 84.54% and an intersection over union (IoU) of 73.24%.

Conclusions
The proposed method outperforms state-of-the-art methods in ABUS lesion segmentation and alleviates incorrect segmentation caused by lesion variety and imaging interference in ABUS images.
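The abstract only outlines the architecture, but the core idea behind DSGNet and the MFFM, namely that the deepest encoder features carry the strongest semantics and should be injected into every decoding stage, can be sketched as follows. This is a minimal PyTorch sketch under assumed module names, channel sizes, and fusion details (concatenation followed by a 1x1 convolution); it illustrates the general technique, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeepestGuidedFusion(nn.Module):
    """Illustrative multi-scale fusion in which the deepest feature map
    participates in every decoding stage. All names, channel sizes, and
    the concat-then-1x1-conv fusion rule are assumptions for this sketch,
    not details taken from the paper."""

    def __init__(self, channels=(64, 128, 256, 512)):
        super().__init__()
        deep_ch = channels[-1]
        # One 1x1 conv per shallower scale, merging it with the deepest map
        self.merge = nn.ModuleList(
            nn.Conv2d(ch + deep_ch, ch, kernel_size=1) for ch in channels[:-1]
        )

    def forward(self, feats):
        # feats: list of encoder maps, shallowest first; feats[-1] is deepest
        deepest = feats[-1]
        fused = []
        for f, merge in zip(feats[:-1], self.merge):
            # Resample the deepest semantics to this scale's resolution,
            # then fuse it with the local features
            d = F.interpolate(deepest, size=f.shape[-2:], mode="bilinear",
                              align_corners=False)
            fused.append(merge(torch.cat([f, d], dim=1)))
        fused.append(deepest)
        return fused
```

A decoder built on this pattern would consume `fused` stage by stage, so the deepest semantics constrains every upsampling step rather than only the first one.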
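The mixed attention mechanism combines spatial self-attention (pixel-to-pixel correlations) with channel self-attention (channel-to-channel correlations). The sketch below follows the generic dual-attention pattern for this combination; the projection sizes and the learnable residual scales `gamma` and `beta` are assumptions of this sketch, not details from the paper.

```python
import torch
import torch.nn as nn

class MixedAttention(nn.Module):
    """Illustrative mixed attention: a spatial self-attention branch and a
    channel self-attention branch, mixed back into the input residually.
    A sketch of the general technique, not the paper's code."""

    def __init__(self, in_ch):
        super().__init__()
        self.query = nn.Conv2d(in_ch, in_ch // 8, kernel_size=1)
        self.key = nn.Conv2d(in_ch, in_ch // 8, kernel_size=1)
        self.value = nn.Conv2d(in_ch, in_ch, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # spatial branch scale
        self.beta = nn.Parameter(torch.zeros(1))   # channel branch scale

    def forward(self, x):
        b, c, h, w = x.shape

        # --- Spatial self-attention: correlations among pixels ---
        q = self.query(x).flatten(2).transpose(1, 2)   # (b, hw, c//8)
        k = self.key(x).flatten(2)                     # (b, c//8, hw)
        v = self.value(x).flatten(2)                   # (b, c, hw)
        attn = torch.softmax(q @ k, dim=-1)            # (b, hw, hw)
        spatial = (v @ attn.transpose(1, 2)).view(b, c, h, w)

        # --- Channel self-attention: correlations among channels ---
        f = x.flatten(2)                                          # (b, c, hw)
        chan_attn = torch.softmax(f @ f.transpose(1, 2), dim=-1)  # (b, c, c)
        channel = (chan_attn @ f).view(b, c, h, w)

        # Residual mix of both branches; zero-initialized scales let the
        # network learn how strongly each attention branch contributes
        return x + self.gamma * spatial + self.beta * channel
```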