Abstract

Slide-free microscopy techniques have been proposed to accelerate standard histopathology and enable intraoperative guidance. One such technology is quantitative oblique back-illumination microscopy (qOBM), which enables real-time, label-free quantitative phase imaging of thick, unsectioned in-vivo and ex-vivo tissues. However, the grayscale phase contrast provided by qOBM differs from the colored histology images familiar to pathologists and clinicians, limiting its current potential for adoption. Here we demonstrate the application of unsupervised deep learning using a Cycle-consistent Generative Adversarial Network (CycleGAN) model to transform qOBM images into virtual hematoxylin and eosin (H&E)-stained images. The models were trained on a dataset of qOBM and H&E images of similar regions in excised brain tissue from a 9L gliosarcoma rat tumor model. We observed successful qOBM-to-H&E conversion of both uninvolved and tumor-containing specimens, as demonstrated by a classifier test. We describe several crucial preprocessing steps that improve conversion quality, including intensity inversion, pixel harmonization, and color normalization. This unsupervised deep learning framework occasionally exhibits subpar performance; for example, as with GANs in general, it can produce so-called “hallucinations”, displaying features not actually present in the original qOBM images. We anticipate that this behavior can be minimized with more extensive training and the deployment of advanced machine learning techniques, and that virtual-H&E-converted qOBM imaging will prove safe and appropriate for rapid tissue imaging applications.
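The preprocessing steps named above are straightforward to illustrate. Below is a minimal sketch in Python/NumPy of what intensity inversion, pixel harmonization, and color normalization could look like in such a pipeline; the function names, the percentile clipping window, and the Reinhard-style reference-statistics normalization are illustrative assumptions, not the authors' exact implementation.

```python
# Illustrative preprocessing sketch (assumed, not the authors' exact code).
# qOBM phase images are assumed to be single-channel float32 NumPy arrays;
# H&E images are assumed to be RGB float arrays scaled to [0, 1].
import numpy as np

def invert_intensity(phase_img: np.ndarray) -> np.ndarray:
    """Invert grayscale intensities so that structures appearing bright in
    qOBM phase contrast (e.g., nuclei) become dark, as in H&E staining."""
    lo, hi = phase_img.min(), phase_img.max()
    scaled = (phase_img - lo) / (hi - lo + 1e-8)  # rescale to [0, 1]
    return 1.0 - scaled

def harmonize_pixels(img: np.ndarray, percentiles=(1, 99)) -> np.ndarray:
    """Clip outlier pixels and rescale to [0, 1] so that images from both
    domains enter training with a comparable dynamic range."""
    lo, hi = np.percentile(img, percentiles)
    clipped = np.clip(img, lo, hi)
    return (clipped - lo) / (hi - lo + 1e-8)

def normalize_color(rgb_img: np.ndarray,
                    ref_mean: np.ndarray,
                    ref_std: np.ndarray) -> np.ndarray:
    """Match the per-channel mean and standard deviation of an H&E image
    to a chosen reference slide (a simple Reinhard-style normalization)."""
    mean = rgb_img.mean(axis=(0, 1))
    std = rgb_img.std(axis=(0, 1)) + 1e-8
    out = (rgb_img - mean) / std * ref_std + ref_mean
    return np.clip(out, 0.0, 1.0)
```

One motivation for such steps is that CycleGAN training is unpaired: if the two image domains differ in brightness conventions or stain color, the generator spends capacity learning those global shifts rather than the tissue-level mapping of interest.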
