Abstract

The morphology of tissues in pathological images is used routinely by pathologists to assess the degree of malignancy of pancreatic ductal adenocarcinoma (PDAC). Automatic and accurate segmentation of tumor cells and their surrounding tissues is often a crucial step in obtaining reliable morphological statistics, yet it remains challenging due to the great variation in appearance and morphology. In this paper, a selected multi-scale attention network (SMANet) is proposed to segment tumor cells, blood vessels, nerves, islets and ducts in pancreatic pathological images. The selected multi-scale attention module enhances effective information, supplements useful information and suppresses redundant information at different scales from the encoder and decoder. It consists of a selection unit (SU) module and a multi-scale attention (MA) module. The selection unit module effectively filters features. The multi-scale attention module enhances effective information through spatial attention and channel attention, and combines features from different levels to supplement useful information; this helps the network learn information from different receptive fields and improves the segmentation of tumor cells, blood vessels and nerves. An original-feature fusion unit is also proposed to supplement original image information and reduce the under-segmentation of small tissues such as islets and ducts. The proposed method outperforms state-of-the-art deep learning algorithms on our PDAC pathological images and achieves competitive results on the GlaS challenge dataset. On our PDAC dataset, the mDice and mIoU reach 0.769 and 0.665, respectively.
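
The abstract does not give the internal design of the SU and MA modules, so the following PyTorch sketch is illustrative only. It assumes a sigmoid feature-gating selection unit, squeeze-and-excitation-style channel attention, and a convolutional spatial attention over encoder/decoder features fused by addition; all class names and parameters (SelectionUnit, MultiScaleAttention, reduction, the fusion scheme) are assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SelectionUnit(nn.Module):
        # Hypothetical gating block: a 1x1 conv + sigmoid produces a mask that
        # suppresses redundant activations (the role the abstract attributes to the SU module).
        def __init__(self, channels):
            super().__init__()
            self.gate = nn.Sequential(nn.Conv2d(channels, channels, kernel_size=1), nn.Sigmoid())

        def forward(self, x):
            return x * self.gate(x)

    class MultiScaleAttention(nn.Module):
        # Hypothetical MA block: channel attention (squeeze-and-excitation style) plus
        # spatial attention over fused encoder/decoder features of different scales.
        def __init__(self, channels, reduction=8):
            super().__init__()
            self.channel_att = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Conv2d(channels, channels // reduction, kernel_size=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels // reduction, channels, kernel_size=1),
                nn.Sigmoid(),
            )
            self.spatial_att = nn.Sequential(nn.Conv2d(2, 1, kernel_size=7, padding=3), nn.Sigmoid())

        def forward(self, fine, coarse):
            # Bring the coarser decoder feature map to the finer resolution and fuse by addition.
            coarse = F.interpolate(coarse, size=fine.shape[2:], mode="bilinear", align_corners=False)
            fused = fine + coarse
            fused = fused * self.channel_att(fused)   # channel attention
            stats = torch.cat([fused.mean(1, keepdim=True), fused.amax(1, keepdim=True)], dim=1)
            return fused * self.spatial_att(stats)    # spatial attention

    if __name__ == "__main__":
        fine = torch.randn(1, 64, 128, 128)   # encoder feature (finer scale)
        coarse = torch.randn(1, 64, 64, 64)   # decoder feature (coarser scale)
        out = MultiScaleAttention(64)(SelectionUnit(64)(fine), coarse)
        print(out.shape)                      # torch.Size([1, 64, 128, 128])

The addition-based fusion and the 7x7 spatial-attention kernel are common defaults here, chosen only to make the sketch runnable; the paper's actual fusion and attention designs may differ.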
