Abstract

Background: Studying how individual cells spatially and temporally organize within the embryo is a fundamental issue in modern developmental biology to better understand the first stages of embryogenesis. In order to perform high-throughput analyses in three-dimensional microscopic images, it is essential to be able to automatically segment, classify and track cell nuclei. Many 3D/4D segmentation and tracking algorithms have been reported in the literature. Most of them are specific to particular models or acquisition systems and often require fine-tuning of parameters.

Results: We present a new automatic algorithm to segment and simultaneously classify cell nuclei in 3D/4D images. Segmentation relies on training samples that are interactively provided by the user and on an iterative thresholding process. This algorithm can correctly segment nuclei even when they are touching, and remains effective under temporal and spatial intensity variations. The segmentation is coupled to a classification of nuclei according to cell cycle phases, allowing biologists to quantify the effect of genetic perturbations and drug treatments. Robust 3D geometrical shape descriptors are used as training features for classification. Segmentation and classification results of three complete datasets are presented. In our working dataset of the Caenorhabditis elegans embryo, only 21 nuclei out of 3,585 were not detected, the overall F-score for segmentation reached 0.99, and more than 95% of the nuclei were classified in the correct cell cycle phase. No merging of nuclei was found.

Conclusion: We developed a novel generic algorithm for segmentation and classification in 3D images. The method, referred to as Adaptive Generic Iterative Thresholding Algorithm (AGITA), is freely available as an ImageJ plug-in.
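To make the iterative-thresholding idea mentioned above concrete, the following is a minimal Python sketch only. It is not the AGITA implementation: the threshold sweep, the voxel-count bounds used as a stand-in for the trained shape criteria, and all function and parameter names are illustrative assumptions.

```python
# Conceptual sketch of iterative thresholding for 3D nuclei segmentation.
# NOT the AGITA implementation: threshold sweep, size bounds and names
# below are illustrative assumptions.
import numpy as np
from scipy import ndimage as ndi

def iterative_threshold_segment(volume, thresholds, min_voxels=50, max_voxels=5000):
    """Sweep a global threshold from high to low and keep connected
    components whose voxel count falls in a plausible nucleus range."""
    kept = np.zeros(volume.shape, dtype=np.int32)
    next_label = 1
    for t in sorted(thresholds, reverse=True):
        mask = (volume >= t) & (kept == 0)        # skip voxels already assigned
        labels, n = ndi.label(mask)
        sizes = np.bincount(labels.ravel())
        for lab in range(1, n + 1):
            if min_voxels <= sizes[lab] <= max_voxels:
                kept[labels == lab] = next_label  # accept this candidate nucleus
                next_label += 1
    return kept

# Tiny synthetic example: one bright cube in a noisy volume.
rng = np.random.default_rng(0)
vol = rng.random((32, 64, 64))
vol[10:20, 10:20, 10:20] += 2.0
seg = iterative_threshold_segment(vol, thresholds=np.linspace(0.8, 2.4, 8))
print(seg.max(), "object(s) kept")
```

In this toy version a fixed size range plays the role of the shape criteria that AGITA learns from the user-provided training samples; sweeping the threshold from high to low lets bright, touching nuclei be picked up at different intensity levels.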

Highlights

  • Studying how individual cells spatially and temporally organize within the embryo is a fundamental issue in modern developmental biology to better understand the first stages of embryogenesis

  • We present the results of applying our novel algorithm to three different datasets containing embryo nuclei from C. elegans, Drosophila and 3D simulated data [11]

  • Our original motivation was to automate the segmentation of nuclei in the C. elegans dataset where none of the available methods had provided satisfactory results



Introduction

Studying how individual cells spatially and temporally organize within the embryo is a fundamental issue in modern developmental biology to better understand the first stages of embryogenesis. To perform systematic studies and high-throughput analyses, automated methods that quantify nuclei over time and reconstruct cell lineages are required. For accurate geometrical and morphological analyses, a complete segmentation of nuclei is needed. For these reasons, many segmentation and/or tracking methods have been developed. Santella et al. segmented nuclei by 2D detection with a difference of Gaussians followed by a 3D reconstruction based on Bayesian features [4]. Another algorithm, based on a Bayesian estimation framework for tracking, was proposed by Carranza et al., where the nuclei are detected using an h-dome transform [5]. Multiple level sets combined with background/foreground detection were proposed by Chinta et al. [6]. Most of these algorithms require fine-tuning of many parameters and are often only successful for dedicated applications with specific acquisition systems or labeling protocols.
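As a point of reference for the kind of detector cited above, here is a hedged Python sketch of 3D difference-of-Gaussians (DoG) blob detection. It only illustrates the general technique; the parameter values and helper names are assumptions and do not reproduce the cited methods.

```python
# Hedged sketch of difference-of-Gaussians (DoG) nucleus detection, the
# kind of detector referenced above (e.g. Santella et al. [4]).
# Parameter values and names are illustrative assumptions.
import numpy as np
from scipy import ndimage as ndi

def dog_detect(volume, sigma_small=2.0, sigma_large=3.2, min_response=0.05):
    """Return coordinates of local maxima of the DoG response in a 3D stack."""
    dog = ndi.gaussian_filter(volume, sigma_small) - ndi.gaussian_filter(volume, sigma_large)
    # A voxel is a candidate nucleus centre if it dominates its 3x3x3
    # neighbourhood and its response exceeds a small absolute floor.
    local_max = (dog == ndi.maximum_filter(dog, size=3)) & (dog > min_response)
    return np.argwhere(local_max)

# Synthetic check: one Gaussian-shaped "nucleus" in a noisy 3D volume.
rng = np.random.default_rng(1)
zz, yy, xx = np.mgrid[0:32, 0:64, 0:64]
vol = rng.normal(0.0, 0.02, (32, 64, 64))
vol += np.exp(-((zz - 16) ** 2 + (yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 4.0 ** 2))
print(dog_detect(vol))   # reports a maximum near voxel (16, 32, 32)
```

Detectors of this type locate nucleus centres but do not by themselves delineate nucleus boundaries, which is why a full segmentation step, such as the one proposed here, is still needed for geometrical and morphological analyses.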


