Abstract

Machine learning offers immense potential as a transformative tool capable of reshaping optical microscopy and quantitative modeling in cell biology. Here we exemplify this potential through the development of a generative adversarial network (GAN) designed to comprehend and predict cell traction force maps. Empowered by a hybrid dataset from traction force microscopy (TFM) and phase-field modeling (PFM), the GAN learns the intricacies of the traction force maps of contractile cells in complex chemomechanical environments, with the sole input being the phase-contrast images of the cells. The trained GAN accurately predicts collective durotaxis by leveraging the learned asymmetric traction force maps, while also unveiling the concealed correlation between substrate stiffness and cell contractility arising from mechanotransduction. Remarkably, despite its foundation in epithelial cell data, our image-learning algorithm can be extended to other contractile cell types by adjusting a single scaling factor. Our approach underscores the potential of synergizing force microscopies and biophysical models with image-based learning, thus catalyzing data-driven scientific revelations in cell mechanobiology.
