Abstract

Segmentation of regions of interest (ROIs) in medical images is an important step in image analysis for computer-aided diagnosis systems. In recent years, segmentation methods based on fully convolutional networks (FCNs) have achieved great success on general images. FCN performance stems primarily from leveraging large labeled datasets to hierarchically learn features that capture both the shallow appearance and the deep semantics of the images. However, such dependence on large datasets does not translate well to medical images, where annotated training data are scarce, and FCNs then produce coarse ROI detections and poor boundary definitions. To overcome this limitation, medical image-specific FCN methods have been introduced with post-processing techniques to refine the segmentation results; however, the performance of these methods relies on the appropriate tuning of a large number of parameters and on data-specific post-processing techniques. In this study, we leverage the state-of-the-art image feature learning method of the generative adversarial network (GAN) for its inherent ability to produce consistent and realistic image features using deep neural networks and the concept of adversarial learning. We improve upon the GAN such that ROI features can be learned at different levels of complexity (simple and complex), in a controlled manner, via our proposed dual-path adversarial learning (DAL). The ROI features learned by our DAL are then used to augment the existing FCN training data, which increases the overall feature diversity. We conducted experiments on three public datasets with a variety of visual characteristics. Our results demonstrate that our DAL can improve FCN-based segmentation methods, outperforming or remaining competitive with state-of-the-art methods without using medical image-specific optimizations.
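To make the adversarial-learning concept mentioned above concrete, the sketch below is a deliberately minimal toy example (not the paper's DAL architecture): a linear generator learns to match a 1-D Gaussian data distribution while a logistic discriminator tries to separate real from generated samples. All names and hyperparameters here are illustrative assumptions.

```python
import numpy as np

# Toy 1-D GAN sketch: generator g(z) = a*z + b, discriminator D(x) = sigmoid(w*x + c).
# This illustrates only the adversarial min-max idea, not the paper's dual-path DAL.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

a, b = 1.0, 0.0   # generator parameters (illustrative)
w, c = 0.1, 0.0   # discriminator parameters (illustrative)
lr = 0.05

for step in range(2000):
    z = rng.standard_normal(64)          # latent noise
    real = 4.0 + rng.standard_normal(64)  # "real" data: N(4, 1)
    fake = a * z + b                      # generated samples

    # Discriminator step: ascend log D(real) + log(1 - D(fake)).
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step: ascend log D(fake) (non-saturating generator loss).
    d_fake = sigmoid(w * (a * z + b) + c)
    a += lr * np.mean((1 - d_fake) * w * z)
    b += lr * np.mean((1 - d_fake) * w)
```

After training, the generator's offset `b` drifts toward the real-data mean, which is the sense in which adversarial learning produces "realistic" samples; in the paper's setting the same principle is applied to ROI image features rather than scalars.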
