Abstract

Study question
Is it possible to automate the process of detecting individual blastomeres within a 4-cell embryo?

Summary answer
Deep learning models are capable of identifying individual cells in single focal plane images of 4-cell embryos.

What is known already
As individual blastomeres within a 4-cell embryo maintain totipotency, their intercellular junctions are critical in maintaining and directing communication. These junctions are determined by the zygote's cleavage patterns and can affect the overall embryo 'shape', which can be described as either 'tetrahedral' or 'planar'. Planar embryos carry significantly worse short- and long-term outcomes, including lower blastulation, clinical pregnancy and live birth rates. Therefore, more accurate identification of cell borders at the 4-cell stage may contribute to improved classification of cell shape and embryo visualisation.

Study design, size, duration
This was a retrospective cohort analysis of 222 single focal plane images from 3 clinics. Each image captured an embryo at the 4-cell stage and was taken using the Embryoscope™ time-lapse incubator at the central focal plane. Images from two of the clinics were split into training (n = 161) and validation (n = 17) sets. Images from the third clinic formed a blind testing set (n = 44).

Participants/materials, setting, methods
Ground truth masks were manually created by two human operators using the VGG Image Annotator software. A Mask R-CNN neural network with a pre-trained ResNet-50 backbone was trained to segment individual blastomeres from the training images. Data augmentation (flips, rotations, Gaussian noise, cropping, brightness changes and optical distortion) was applied during training. The model's performance was evaluated using the IoU metric (a measure of overlap between model-predicted and human-annotated masks).

Main results and the role of chance
The model was evaluated on a blind test set of 44 images.
The model achieved a mean IoU of 0.92 for individual cells (standard deviation (SD) = 0.05), with precision and sensitivity of 0.95 and 0.97 respectively. The mean IoU for the entire embryo (all 4 blastomeres combined) was 0.92 (SD = 0.02). Furthermore, the model counted the number of cells in the images with 70% accuracy, never deviating by more than 1 cell in any error. These errors break down into: fragmentation detected as a cell (2 cases); two cells detected as one (1 case); a cell lying directly under another cell (4 cases); and duplicate detection of the same cell (6 cases). This last issue could be resolved by rejecting detections with significant overlap. Our results demonstrate that the model can be used across different clinics.

Limitations, reasons for caution
Inaccuracies in segmentation and cell counting sometimes occurred when a cell's borders were unclear or obscured (e.g. in a different focal plane). The inclusion of multiple focal planes will be key to improving performance. Moreover, as only one focal plane was used, ambiguous cases were annotated with a 'best guess'.

Wider implications of the findings
A model capable of detecting individual cells would be highly beneficial in the IVF industry. Aside from automating laborious processes for embryologists, it may also prove a useful tool for future research, such as identifying intercellular contact points or rendering three-dimensional embryo visualisations.

Trial registration number
N/A
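The IoU metric used for evaluation can be illustrated with a minimal sketch on binary masks. This is not the study's evaluation code; the toy masks and function name are assumptions for illustration only.

```python
import numpy as np

def iou(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Intersection over Union between two binary masks."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    union = np.logical_or(pred, true).sum()
    return float(intersection) / float(union) if union else 0.0

# Toy example: two overlapping 4x4 square masks on a 6x6 grid.
a = np.zeros((6, 6), dtype=np.uint8); a[0:4, 0:4] = 1
b = np.zeros((6, 6), dtype=np.uint8); b[1:5, 1:5] = 1
print(round(iou(a, b), 3))  # intersection 9, union 23 -> 0.391
```

An IoU of 1.0 means a predicted blastomere mask matches its human annotation exactly; the reported mean of 0.92 indicates close but imperfect boundary agreement.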
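The proposed fix for duplicate detections, rejecting masks that heavily overlap an already-accepted, higher-confidence mask, amounts to a simple non-maximum suppression over masks. A minimal sketch follows; the 0.5 threshold and the score values are assumptions, not parameters from the study.

```python
import numpy as np

def suppress_duplicates(masks, scores, iou_threshold=0.5):
    """Keep highest-scoring masks first; drop any mask that overlaps an
    already-kept mask by more than iou_threshold (mask-level NMS)."""
    def iou(a, b):
        inter = np.logical_and(a, b).sum()
        union = np.logical_or(a, b).sum()
        return inter / union if union else 0.0

    order = sorted(range(len(masks)), key=lambda i: scores[i], reverse=True)
    kept = []
    for i in order:
        if all(iou(masks[i], masks[j]) <= iou_threshold for j in kept):
            kept.append(i)
    return kept

# Two near-identical detections of one cell, plus one separate cell.
m1 = np.zeros((8, 8), bool); m1[0:4, 0:4] = True
m2 = np.zeros((8, 8), bool); m2[0:4, 0:5] = True   # duplicate of m1
m3 = np.zeros((8, 8), bool); m3[5:8, 5:8] = True   # separate cell
print(suppress_duplicates([m1, m2, m3], [0.9, 0.8, 0.95]))  # -> [2, 0]
```

Applied to the 6 duplicate-detection errors reported above, this filter would leave one detection per cell while keeping genuinely distinct blastomeres.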
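The augmentation pipeline described in the methods (flips, rotations, Gaussian noise, cropping, brightness changes and optical distortion) can be sketched with plain array operations. The library, probabilities and magnitudes used in the study are not stated, so the values below are illustrative; cropping and optical distortion are omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image: np.ndarray) -> np.ndarray:
    """Apply a random subset of the augmentations described:
    flips, 90-degree rotations, Gaussian noise and brightness shifts."""
    img = image.astype(np.float32)
    if rng.random() < 0.5:
        img = np.flip(img, axis=1)             # horizontal flip
    if rng.random() < 0.5:
        img = np.flip(img, axis=0)             # vertical flip
    img = np.rot90(img, k=rng.integers(4))     # random 90-degree rotation
    img = img + rng.normal(0, 2.0, img.shape)  # Gaussian noise
    img = img * rng.uniform(0.9, 1.1)          # brightness change
    return np.clip(img, 0, 255).astype(np.uint8)

out = augment(np.full((16, 16), 128, dtype=np.uint8))
print(out.shape)  # (16, 16)
```

For segmentation training, the same geometric transforms (flips, rotations, crops) must also be applied to the ground truth masks so that predicted and annotated pixels stay aligned.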
