Abstract
Machine learning offers immense potential as a transformative tool capable of reshaping optical microscopy and quantitative modeling in cell biology. Here we exemplify this potential through the development of a generative adversarial network (GAN) designed to comprehend and predict cell traction force maps. Empowered by a hybrid dataset from traction force microscopy (TFM) and phase-field modeling (PFM), the GAN learns the intricacies of the traction force maps of contractile cells in complex chemomechanical environments, with the sole input being the phase-contrast images of the cells. The trained GAN accurately predicts collective durotaxis by leveraging the learned asymmetric traction force maps, while also unveiling the concealed correlation between substrate stiffness and cell contractility arising from mechanotransduction. Remarkably, despite its foundation in epithelial cell data, our image-learning algorithm can be extended to other contractile cell types by adjusting a single scaling factor. Our approach underscores the potential of synergizing force microscopies and biophysical models with image-based learning, thus catalyzing data-driven scientific revelations in cell mechanobiology.