Abstract

Cell migration is a pivotal biological process whose dysregulation is involved in many diseases, including inflammation and cancer. Advances in microscopy technologies now make it possible to study cell migration in vitro, within engineered microenvironments that resemble in vivo conditions. However, to capture an entire 3D migration chamber for extended periods of time and with high temporal resolution, images are generally acquired at low resolution, which poses a challenge for data analysis. Indeed, cell detection and tracking are hampered by the large pixel size (i.e., cell diameters down to 2 pixels), the possibly low signal-to-noise ratio, and distortions of the cell shape due to changes in position along the z-axis. Although fluorescent staining can be used to facilitate cell detection, it may alter cell behavior and may suffer from fluorescence loss over time (photobleaching). Here we describe a protocol that employs an established deep learning method (U-Net) to convert the transmitted light (TL) signal of unlabeled cells imaged at low resolution into a fluorescence-like signal (class 1 probability). We demonstrate its application to the study of cancer cell migration, obtaining a significant improvement in tracking accuracy while avoiding photobleaching. As a result, cells can be tracked for three-fold longer periods of time. To facilitate the application of the protocol, we provide WID-U, an open-source plugin for the FIJI and Imaris imaging software, the training dataset used in this paper, and the code to train the network for custom experimental settings.
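
As a rough illustration of the TL-to-probability conversion step described above, the sketch below shows a small U-Net-style network (written in PyTorch) that maps a single-channel transmitted-light frame to a per-pixel class 1 probability map. The depth, channel counts, and layer choices are assumptions made for brevity and do not reproduce the exact WID-U architecture or training procedure.

```python
# Minimal illustrative sketch of a U-Net-style TL -> class-1-probability mapping.
# All architectural details (depth, channel counts) are assumptions for illustration.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU, as in standard U-Net encoder/decoder blocks
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class SmallUNet(nn.Module):
    def __init__(self, base=16):
        super().__init__()
        self.enc1 = conv_block(1, base)
        self.enc2 = conv_block(base, base * 2)
        self.bottleneck = conv_block(base * 2, base * 4)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2)
        self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = conv_block(base * 2, base)
        self.head = nn.Conv2d(base, 1, 1)  # 1x1 conv -> one logit per pixel

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return torch.sigmoid(self.head(d1))  # class 1 probability map in [0, 1]


if __name__ == "__main__":
    net = SmallUNet()
    tl_frame = torch.rand(1, 1, 128, 128)  # one low-resolution TL frame (batch, channel, H, W)
    prob_map = net(tl_frame)               # fluorescence-like probability map, same spatial size
    print(prob_map.shape)                  # torch.Size([1, 1, 128, 128])
```

In this kind of pipeline, the resulting probability map can then be fed to a standard spot-detection and tracking workflow (e.g., in FIJI or Imaris) in place of a fluorescence channel.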
