Abstract

BACKGROUND
Tunneling nanotubes (TNTs) are cellular structures that connect cell membranes and mediate intercellular communication. TNTs are manually identified and counted by a trained investigator; however, this process is time-intensive. We therefore sought to develop an automated approach for quantitative analysis of TNTs.

METHODS
We used a convolutional neural network (U-Net) deep learning model to segment phase contrast microscopy images of both cancer and non-cancer cells. Our method comprised two parts: preprocessing and model development. We developed a new preprocessing method to label TNTs on a pixel-wise basis, and employed two sequential models to detect them. First, a classification algorithm identified the regions of each image that contain TNTs. Second, the regions classified as TNT-containing were fed into a modified U-Net model to estimate TNTs on a pixel-wise basis.

RESULTS
The U-Net model detected 73.3% of human expert-identified TNTs, counted TNTs and cells, and calculated the TNT-to-cell ratio (TCR). On a test data set, we obtained a precision of 0.88, a recall of 0.67, and an F1 score of 0.76. The predicted and true TCRs were not significantly different between the training and test data sets.

CONCLUSIONS
In summary, we report the application of an automated model, generated by deep learning, that is trained to accurately label and detect TNTs and cells imaged in culture. Continued application and refinement of this process will provide a new approach to the analysis of TNTs, which form to connect cancer and other cells. This approach has the potential to enhance drug screens intended to assess the therapeutic efficacy of experimental agents, and to reproducibly assess TNTs as a potential biomarker of response to therapy in cancer.
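The paper's code is not reproduced here; the following is a minimal sketch of the two-stage pipeline described in METHODS, assuming a PyTorch implementation. The tile size, threshold, and the TileClassifier and MiniUNet architectures are illustrative stand-ins of ours, not the authors' published models.

import torch
import torch.nn as nn

TILE = 128  # assumed tile size in pixels (ours, not from the paper)

class TileClassifier(nn.Module):
    """Stage 1 stand-in: binary classifier -- does this tile contain a TNT?"""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))  # TNT-presence logit

class MiniUNet(nn.Module):
    """Stage 2 stand-in: a drastically reduced U-Net (one encoder level,
    one skip connection) producing a per-pixel TNT logit map."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU())
        self.mid = nn.Sequential(nn.MaxPool2d(2),
                                 nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.dec = nn.Sequential(nn.Conv2d(48, 16, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(16, 1, 1))

    def forward(self, x):
        e = self.enc(x)                        # encoder features
        m = self.up(self.mid(e))               # bottleneck, upsampled back
        return self.dec(torch.cat([e, m], 1))  # decode with skip connection

@torch.no_grad()
def segment_tnts(image, classifier, unet, threshold=0.5):
    """Tile the image; only tiles the classifier flags as TNT-containing
    are passed to the U-Net for pixel-wise segmentation."""
    mask = torch.zeros_like(image)
    for y in range(0, image.shape[-2] - TILE + 1, TILE):
        for x in range(0, image.shape[-1] - TILE + 1, TILE):
            tile = image[..., y:y + TILE, x:x + TILE]
            if torch.sigmoid(classifier(tile)).item() > threshold:  # stage 1
                probs = torch.sigmoid(unet(tile))                   # stage 2
                mask[..., y:y + TILE, x:x + TILE] = (probs > threshold).float()
    return mask

if __name__ == "__main__":
    img = torch.rand(1, 1, 512, 512)  # stand-in for a phase contrast image
    mask = segment_tnts(img, TileClassifier().eval(), MiniUNet().eval())
    print("TNT-positive pixels:", int(mask.sum()))

Running the segmentation network only on classifier-flagged tiles mirrors the paper's two-stage design: pixel-wise inference is spent only on regions likely to contain TNTs rather than on empty background.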
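As a quick consistency check on the RESULTS (ours, for illustration): the F1 score is the harmonic mean of precision and recall, and the reported values agree.

# F1 as the harmonic mean of the reported test-set precision and recall
precision, recall = 0.88, 0.67
f1 = 2 * precision * recall / (precision + recall)
print(f"F1 = {f1:.2f}")  # -> 0.76, matching the reported F1 score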
