Abstract
Study question
Can artificial intelligence (AI) algorithms identify spermatozoa in a semen sample without training data annotated by professionals?

Summary answer
Unsupervised AI methods can discriminate spermatozoa from other cells and debris. These unsupervised methods may have potential for several applications in reproductive medicine.

What is known already
Identification of individual spermatozoa is essential for assessing the motility behaviour of a semen sample. Existing computer-aided systems require training data annotated by professionals, which is resource-demanding. Conversely, data analysed by unsupervised machine learning algorithms can improve supervised algorithms, which are more stable for clinical applications. Unsupervised sperm identification can therefore improve computer-aided sperm analysis systems that predict different aspects of semen samples. Other possible applications are assessing kinematics and counting spermatozoa.

Study design, size, duration
Three sperm-like paint images were created with a graphic design tool and used to train our AI system. Two paintings have an ash-coloured background with randomly distributed white circles, and one painting has a predefined pattern of circles. Selected semen-sample videos from a public dataset comprising videos from 85 participants were used to test our AI system.

Participants/materials, setting, methods
Generative adversarial networks (GANs) have become common AI methods for processing data in an unsupervised way. Based on single image frames extracted from videos, a GAN (SinGAN) can be trained to determine and track the locations of spermatozoa by translating the real images into localization paintings. The resulting model showed the potential to identify the presence of spermatozoa without any prior knowledge about the data.
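The grayscale conversion mentioned under Main results can be pictured with a minimal sketch. This is not the authors' pipeline; it only illustrates one standard way (ITU-R BT.601 luma weights) to turn an RGB video frame, assumed here to be a `(H, W, 3)` uint8 array, into the grayscale input the abstract describes.

```python
import numpy as np

def to_grayscale(frame: np.ndarray) -> np.ndarray:
    """Convert an RGB frame (H, W, 3, uint8) to grayscale using the
    ITU-R BT.601 luma weights. Illustrative only: the preprocessing
    actually used by the authors is not specified in the abstract."""
    weights = np.array([0.299, 0.587, 0.114])  # weights sum to 1.0
    gray = frame.astype(np.float64) @ weights
    return np.clip(gray, 0, 255).astype(np.uint8)

# Tiny synthetic 2x2 "frame": red, green, blue, and white pixels.
frame = np.array([[[255, 0, 0], [0, 255, 0]],
                  [[0, 0, 255], [255, 255, 255]]], dtype=np.uint8)
gray = to_grayscale(frame)
print(gray)  # white maps to 255, pure red to 76
```

In a real setting one would apply this per frame after extracting frames from the semen-sample video with a standard video reader.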
Main results and the role of chance
Visual comparisons of localization paintings with real sperm images show that inverse training of SinGANs can track spermatozoa. Converting colour frames into grayscale frames and using grayscale synthetic sperm-like frames yielded the best visual quality of the generated localization paintings. Feeding real sperm video frames to the SinGAN at different scaling factors, which define the resolution of the input image, produced different quality levels of generated sperm localization paintings. A sperm frame given to the algorithm at scaling factor one leads to random sperm tracking, while scales two to four result in more accurate localization maps than scales five to eight. In particular, scales six to eight produce an output close to the input frame. The proposed method is robust with respect to the number of spermatozoa: detection works well for samples with either a low or a high sperm count. For visual comparisons, visit our GitHub page: https://vlbthambawita.github.io/singan-sperm/. The sperm tracking speed of our SinGAN on an NVIDIA 1080 graphics processing unit is around 17 frames per second, which can be improved using parallel video processing. This shows that the method is capable of real-time analysis.

Limitations, reasons for caution
Unsupervised methods are hard to train, and the results need human verification. The proposed method will require quality control and must be standardized. The unsupervised SinGAN tracker may identify blurry bright spots as non-existing sperm heads, which may restrict its use for sperm counting.

Wider implications of the findings
Assessment of semen samples according to the WHO guidelines is subjective and resource-demanding. This unsupervised model might be used to develop new systems for less time-consuming and more accurate evaluation of semen samples. It may also be used for real-time analysis of prepared spermatozoa in assisted reproduction technology.

Trial registration number
N/A
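The scale levels discussed under Main results can be pictured as a coarse-to-fine image pyramid, which is how SinGAN processes its input. The sketch below computes the per-level frame sizes under the assumption of a constant per-level factor of 0.75 (the default in the original SinGAN implementation); the exact factor and resolutions used in this study are not stated in the abstract.

```python
import numpy as np

def pyramid_sizes(height: int, width: int, n_scales: int, r: float = 0.75):
    """Spatial sizes of a coarse-to-fine pyramid with n_scales levels.
    Level k is the full frame resized by r**(n_scales - 1 - k), so
    level 0 is the coarsest and the last level is full resolution.
    The factor r = 0.75 is an assumption borrowed from the original
    SinGAN implementation, not a value reported in this abstract."""
    sizes = []
    for k in range(n_scales):
        f = r ** (n_scales - 1 - k)
        sizes.append((max(1, round(height * f)), max(1, round(width * f))))
    return sizes

# Hypothetical 480x640 sperm video frame across eight scale levels.
sizes = pyramid_sizes(480, 640, n_scales=8)
print(sizes)  # coarsest level first, (480, 640) last
```

Under this assumption, low scale levels carry only coarse structure (hence the random tracking at scale one reported above), while the highest levels approach the input resolution.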