Abstract

Unpaired cross-domain medical image translation is a challenging problem because the target image modality cannot be mapped directly from the input data distribution. The best-performing approach to date has been the cycle-consistent generative adversarial network (CycleGAN), which uses a cycle-consistency loss to perform the task. Although effective, it produces output images that are small and blurry. Meanwhile, lifestyle changes and increased exposure to carcinogens in various forms have raised the incidence of cancer; statistics suggest that roughly one in eight women will develop breast cancer at some stage of her life. This paper therefore combines two GANs, CycleGAN and the super-resolution GAN (SRGAN), in a two-stage pipeline to obtain translated breast images at improved resolution. The proposed model is tested on images of breast cancer patients, synthesizing CT scans from PET scans and vice versa, so that patients are not exposed to an extremely potent dose of radiation. To verify that the tumour is preserved in the estimated image, a simplified U-Net feature extractor is also used. Quantitative studies are carried out for both stages of the simulation to establish the efficiency of the proposed model.
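The cycle-consistency loss mentioned above is the core of CycleGAN training: for generators G (PET→CT) and F (CT→PET), it penalizes how far a round trip through both generators lands from the original image. The following is a minimal NumPy sketch of that loss; the two generators here are hypothetical toy stand-ins (simple invertible functions), not the paper's actual networks.

```python
import numpy as np

def cycle_consistency_loss(G, F, x, y):
    """L1 cycle-consistency loss used by CycleGAN:
    ||F(G(x)) - x||_1 + ||G(F(y)) - y||_1,
    where G maps domain X -> Y and F maps Y -> X."""
    forward_cycle = np.mean(np.abs(F(G(x)) - x))   # x -> Y -> back to X
    backward_cycle = np.mean(np.abs(G(F(y)) - y))  # y -> X -> back to Y
    return forward_cycle + backward_cycle

# Hypothetical stand-in "generators": exact inverses of each other,
# so the cycle loss is zero by construction.
G = lambda img: img * 2.0   # toy "PET -> CT" mapping
F = lambda img: img / 2.0   # toy "CT -> PET" mapping

x = np.random.rand(4, 4)    # toy "PET" image
y = np.random.rand(4, 4)    # toy "CT" image
loss = cycle_consistency_loss(G, F, x, y)
```

In practice G and F are convolutional networks and this term is added to the usual adversarial losses; when the generators are perfect inverses, as in the toy example, the loss is zero.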
