Virtual staining of microscopy specimens using GAN-based methods could resolve critical concerns of the manual staining process, as shown in recent studies on histopathology images. However, most of these works adopt a basic GAN framework that ignores microscopy image characteristics, and their performance has been evaluated only with structural and error statistics (SSIM and PSNR) between synthetic and ground-truth images, without considering any color space, even though virtual staining is fundamentally a color transformation. Moreover, major aspects of staining, such as color, contrast, focus, and image realness, have been entirely ignored. Modifying the GAN architecture to incorporate microscopy image features may therefore be beneficial, and its successful implementation needs to be examined across the various aspects of the staining process. Accordingly, we designed a new feature-fusion GAN for virtual staining and assessed its performance with a state-of-the-art multi-evaluation framework comprising qualitative metrics (histogram correlation of color and brightness), quantitative metrics (SSIM and PSNR), focus aptitude (Brenner metric and spectral moments), and perceptual influence (semantic perceptual influence score). For experimental validation, cell boundaries were highlighted with two different staining reagents, Safranin-O and Toluidine Blue O, on plant microscopy images of potato tuber. We evaluated virtually stained image quality against the ground truth in the RGB and YCbCr color spaces using the defined metrics, and the results were found to be highly consistent. We further demonstrated the impact of feature fusion. Collectively, this study could serve as a baseline for guiding architectural upgrades of deep pipelines for virtual staining of diverse microscopy modalities and for future benchmark methodologies or protocols.
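To make the evaluation framework concrete, the following is a minimal sketch of how such multi-metric scoring could be computed for a virtually stained image against its ground truth. It uses scikit-image for SSIM and PSNR, NumPy histograms for the qualitative histogram-correlation check, and the classic Brenner gradient as the focus measure; all function and variable names here are illustrative assumptions, not the authors' implementation, and the semantic perceptual influence score is omitted since it is specific to the paper.

```python
# Hypothetical sketch of a multi-metric evaluation for virtual staining.
# Assumes float images in [0, 1]; grayscale is used here for simplicity.
import numpy as np
from skimage.metrics import structural_similarity, peak_signal_noise_ratio

def brenner_focus(gray):
    # Brenner gradient: sum of squared intensity differences two pixels apart,
    # a standard autofocus/sharpness measure.
    diff = gray[:, 2:] - gray[:, :-2]
    return float(np.sum(diff ** 2))

def histogram_correlation(a, b, bins=64):
    # Pearson correlation between intensity histograms: a simple qualitative
    # check of brightness/color distribution agreement.
    ha, _ = np.histogram(a, bins=bins, range=(0.0, 1.0))
    hb, _ = np.histogram(b, bins=bins, range=(0.0, 1.0))
    return float(np.corrcoef(ha, hb)[0, 1])

def evaluate(pred, truth):
    # pred: virtually stained image; truth: chemically stained ground truth.
    return {
        "ssim": structural_similarity(truth, pred, data_range=1.0),
        "psnr": peak_signal_noise_ratio(truth, pred, data_range=1.0),
        "hist_corr": histogram_correlation(pred, truth),
        "brenner_pred": brenner_focus(pred),
    }
```

In practice such metrics would be computed per channel in both RGB and YCbCr, as the abstract describes, to capture color-space-dependent behavior.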