Abstract

Collecting substantial amounts of tissue image data is expensive, time-consuming, and often restricted by sample availability. This data sparsity limits the use of neural networks for image classification and knowledge extraction in many medical and biological applications. In this work, we present a novel method to procedurally generate synthetic fluorescent multiplex immunohistochemistry (fm-IHC) images using human-defined rules and a conditional Generative Adversarial Network (cGAN). Our method consists of three steps. First, we compress high-resolution fm-IHC images using a cell-based compression method (Cell2Grid) in which every cell is represented by a single pixel, and we train a cGAN to reverse this compression. Second, we define a procedural algorithm for generating synthetic Cell2Grid images; to do so, we (1) define rules for generating large-scale tissue morphology, (2) define the relative abundance of cell phenotypes in each tissue segment, and (3) define marker distributions for each phenotype. Third, we apply the trained cGAN to convert the synthetic Cell2Grid images into fm-IHC images. As a showcase, we generated synthetic fm-IHC images with seven antibody color channels depicting murine pancreatic islets at different stages of insulitis. We derived the rules for the procedural algorithm from only 59 real sample images and trained the cGAN on 1,575 unlabeled fm-IHC image crops. We show that our method provides full control over tissue morphology, cell composition, and image ground truth, and that it can support the training of image classification networks.
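
To illustrate the compression idea behind the first step, the sketch below maps segmented cells to a coarse grid in which each cell occupies a single pixel whose channels hold that cell's marker intensities. This is a minimal sketch under stated assumptions: the function name, parameters, and collision handling are illustrative and do not reproduce the paper's actual Cell2Grid implementation.

```python
import numpy as np

def cells_to_grid(centroids_um, marker_intensities, pixel_size_um=5.0):
    """Hypothetical Cell2Grid-style compression.

    centroids_um: (N, 2) array of cell-centre coordinates in micrometres.
    marker_intensities: (N, C) array of per-cell mean intensities for C markers.
    Returns an (H, W, C) image in which each cell is represented by one pixel.
    """
    # Assign every cell to a grid position at the chosen pixel size
    grid_xy = np.floor(centroids_um / pixel_size_um).astype(int)
    h, w = grid_xy[:, 1].max() + 1, grid_xy[:, 0].max() + 1
    c = marker_intensities.shape[1]
    grid = np.zeros((h, w, c), dtype=np.float32)
    for (x, y), markers in zip(grid_xy, marker_intensities):
        # If two cells fall on the same pixel, the later one overwrites the
        # earlier one; a real pipeline would resolve such collisions explicitly.
        grid[y, x] = markers
    return grid

# Toy example: three cells with seven marker channels, matching the number of
# antibody color channels in the showcase dataset.
rng = np.random.default_rng(0)
grid_img = cells_to_grid(rng.uniform(0, 100, (3, 2)), rng.random((3, 7)))
print(grid_img.shape)
```

In this compressed representation, the cGAN's task is the inverse mapping: given the one-pixel-per-cell grid, synthesize a plausible high-resolution fm-IHC image.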
