Abstract

Segmenting the hepatic and portal veins is a difficult task, since the source images are subject to multiple distortions. To restore these images effectively and minimize distortion, a multi-stage deep adversarial learning network is proposed. The proposed network reliably segments the hepatic and portal veins from distorted Magnetic Resonance (MR) images and directly accounts for variation in vascular flow during vessel segmentation. It is a two-stage deep adversarial network: the first stage corrects geometric distortion through regularization, and the second stage compensates for image blurriness with a sub-pixel mechanism. To detect hepatic venous flow, shared clustering is used to correlate the hepatic and venous flows. To improve segmentation through restoration, the network is trained end to end with image-adaptive denoising. The generalizability of the proposed Denoising Deep Adversarial Network (DDAN) is demonstrated on a curated, distorted TCGA dataset. Experiments segment distorted MR images with depth estimation, semantic inter-class decisions, and classification. The proposed DDAN is evaluated on a multi-phase open MRI dataset from a local clinic and the public Kaggle Br35H dataset. The quantitative and qualitative results demonstrate the practical applicability of the proposed deep learning model: image edges and the scale texture of the portal and hepatic veins are clearly segmented, which can assist real-time applications such as tumour surgery.
