Cell-based medicinal products (CBMPs) are a growing class of therapeutics that promise new treatments for complex and rare diseases. Given the inherent complexity of the whole human cells comprising CBMPs, there is a need for robust and fast analytical methods for characterization, process monitoring, and quality control (QC) testing during their manufacture. Existing techniques to evaluate and monitor cell quality typically rely on labor-intensive, expensive, and highly specific staining assays. In this work, we combine image-based deep learning with flow imaging microscopy (FIM) to predict cell health metrics using cellular morphology "fingerprints" extracted from images of unstained Jurkat cells (immortalized human T-lymphocyte cells). A supervised fingerprinting algorithm (i.e., one trained with human-generated labels for images), trained on images of unstained healthy and dead cells, provides a robust stain-free, non-invasive, and non-destructive method for determining cell viability. Results from the stain-free method are in good agreement with traditional stain-based cytometric viability measurements. Additionally, when trained with images of healthy cells, dead cells, and cells undergoing chemically induced apoptosis, the supervised fingerprinting algorithm is able to distinguish between the three cell states, and the results are independent of specific treatments or signaling pathways. We then show that an unsupervised variational autoencoder (VAE) algorithm trained on the same images, but without human-generated labels, is able to distinguish between samples of healthy, dead, and apoptotic cells, along with cellular debris, based on learned morphological features and without human input. With this, we demonstrate that VAEs are a powerful exploratory technique that can be used as a process monitoring analytical tool.
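To make the supervised fingerprinting idea concrete, the following is a minimal, hypothetical sketch: each cell image is reduced to a morphology "fingerprint" (feature vector), and a label is assigned by comparing against class centroids learned from human-labeled examples. The feature names, values, and the nearest-centroid classifier are illustrative assumptions, not the deep-learning pipeline used in the paper.

```python
import math

# Hypothetical morphological features; a real pipeline would extract these
# (or learned deep features) from FIM images of unstained cells.
def fingerprint(cell):
    """Reduce a cell's measurements to a feature vector (a 'fingerprint')."""
    return (cell["diameter_um"], cell["circularity"], cell["mean_intensity"])

def train_centroids(labeled_cells):
    """Compute one mean fingerprint (centroid) per human-provided label."""
    groups = {}
    for cell, label in labeled_cells:
        groups.setdefault(label, []).append(fingerprint(cell))
    return {
        label: tuple(sum(dim) / len(vecs) for dim in zip(*vecs))
        for label, vecs in groups.items()
    }

def classify(cell, centroids):
    """Assign the label of the nearest centroid (Euclidean distance)."""
    fp = fingerprint(cell)
    return min(centroids, key=lambda lbl: math.dist(fp, centroids[lbl]))

# Toy labeled training data (invented numbers for demonstration only):
# healthy cells here are larger, rounder, and dimmer than dead cells.
train = [
    ({"diameter_um": 12.0, "circularity": 0.95, "mean_intensity": 0.30}, "healthy"),
    ({"diameter_um": 11.5, "circularity": 0.92, "mean_intensity": 0.35}, "healthy"),
    ({"diameter_um": 8.0,  "circularity": 0.60, "mean_intensity": 0.80}, "dead"),
    ({"diameter_um": 7.5,  "circularity": 0.55, "mean_intensity": 0.85}, "dead"),
]
centroids = train_centroids(train)
unseen = {"diameter_um": 11.8, "circularity": 0.90, "mean_intensity": 0.33}
print(classify(unseen, centroids))  # → healthy
```

The unsupervised VAE variant described above differs in that no labels are provided: the model learns a latent representation of morphology on its own, and distinct cell states (healthy, dead, apoptotic, debris) separate into clusters in that latent space.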