Abstract
Extracting long tracks and lineages from videomicroscopy requires an extremely low error rate, which is challenging on complex data sets of dense or deforming cells. Leveraging temporal context is key to overcoming this challenge. We propose DiSTNet2D, a new deep neural network architecture for two-dimensional (2D) cell segmentation and tracking that leverages both mid- and long-term temporal information. DiSTNet2D takes seven frames as input and uses a postprocessing procedure that exploits information from the entire video to correct segmentation errors. DiSTNet2D outperforms two recent methods on two experimental data sets, one containing densely packed bacterial cells and the other containing eukaryotic cells. It is integrated into an ImageJ-based graphical user interface for 2D data visualization, curation, and training. Finally, we demonstrate the utility of DiSTNet2D by correlating the size and shape of cells with their transport properties over large statistics, for both bacterial and eukaryotic cells.

Published by the American Physical Society 2024