Abstract

Lesion detectors based on deep learning can assist doctors in diagnosing diseases, but their performance is often unsatisfactory because training samples are scarce. Augmenting a detector's training set with synthesized images is therefore beneficial. However, when the imaging texture of a medical image is delicate, images synthesized by existing methods can be too poor in quality to meet the training requirements of a detector. To address this, a medical image augmentation method, namely a texture-constrained multichannel progressive generative adversarial network (TMP-GAN), is proposed in this work. TMP-GAN uses joint training of multiple channels to avoid the typical shortcomings of current generation methods, and an adversarial learning-based texture discrimination loss to further improve the fidelity of the synthesized images. In addition, TMP-GAN employs a progressive generation mechanism to steadily improve the accuracy of the medical image synthesizer. Experiments on the publicly available CBIS-DDSM dataset and our pancreatic tumor dataset show that the precision/recall/F1-score of a detector trained on the TMP-GAN-augmented datasets improves by 2.59%/2.70%/2.77% and 2.44%/2.06%/2.36%, respectively, over the best results of the other data augmentation methods. The FROC curve of the detector is also better than the curves obtained with the comparison augmentation methods. We therefore believe that the proposed TMP-GAN is a practical technique for efficiently implementing lesion detection applications.
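
The abstract does not spell out how the texture discrimination loss is formulated, so the following is only a minimal, hypothetical PyTorch sketch of an adversarial texture loss of the general kind described: a patch-level discriminator scores the local texture realism of real versus synthesized images, and the generator is penalized when its patches are recognized as synthetic. The names (PatchTextureDiscriminator, texture_discrimination_loss) and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: the discriminator architecture and loss below are
# assumptions; TMP-GAN's actual texture discrimination loss is defined in the paper.
import torch
import torch.nn as nn


class PatchTextureDiscriminator(nn.Module):
    """Hypothetical patch-level critic that scores local texture realism."""

    def __init__(self, in_channels: int = 1, base: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, base, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base, base * 2, 4, stride=2, padding=1),
            nn.InstanceNorm2d(base * 2),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base * 2, 1, 4, stride=1, padding=1),  # per-patch realism logits
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def texture_discrimination_loss(disc: nn.Module,
                                real: torch.Tensor,
                                fake: torch.Tensor):
    """Standard non-saturating adversarial losses on patch texture scores."""
    bce = nn.BCEWithLogitsLoss()
    # Discriminator: real patches -> 1, synthesized patches -> 0.
    real_logits = disc(real)
    fake_logits = disc(fake.detach())
    d_loss = (bce(real_logits, torch.ones_like(real_logits)) +
              bce(fake_logits, torch.zeros_like(fake_logits)))
    # Generator: push synthesized patches toward the "real texture" label.
    gen_logits = disc(fake)
    g_loss = bce(gen_logits, torch.ones_like(gen_logits))
    return d_loss, g_loss


if __name__ == "__main__":
    disc = PatchTextureDiscriminator()
    real = torch.randn(2, 1, 128, 128)  # stand-in for real lesion patches
    fake = torch.randn(2, 1, 128, 128)  # stand-in for synthesized patches
    d_loss, g_loss = texture_discrimination_loss(disc, real, fake)
    print(d_loss.item(), g_loss.item())
```

In practice such a texture term would be added, with a weighting coefficient, to the generator's overall objective alongside the main adversarial and reconstruction losses; the weighting used by TMP-GAN is not given in the abstract.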
