Abstract
The Drosophila model has been widely used to study cardiac function, especially in combination with optogenetics and optical coherence tomography (OCT), which can continuously acquire large numbers of cross-sectional images of the Drosophila heart in vivo over time. Quantitative analysis of cardiac function requires fast and accurate extraction of dynamic cardiac parameters, such as heartbeat rate, from these images. Here we present a deep-learning method that integrates U-Net and generative adversarial network architectures with residually connected convolutions for high-precision OCT image segmentation of the Drosophila heart and for measurement of dynamic cardiac parameters in optogenetics-OCT-based cardiac function research. Compared with previous approaches, our network achieves intersection-over-union and Dice similarity coefficient scores above 98%, enabling better quantification of dynamic heart parameters and improving the efficiency of Drosophila-model-based cardiac research on the optogenetics-OCT platform.
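The segmentation accuracy reported above is measured with intersection over union (IoU) and the Dice similarity coefficient. As a minimal illustrative sketch (not the authors' code), these two metrics can be computed from a pair of binary segmentation masks as follows; the function name and mask shapes are assumptions for illustration:

```python
import numpy as np

def iou_and_dice(pred, target):
    """Compute IoU and the Dice similarity coefficient for two
    binary segmentation masks of the same shape."""
    pred = np.asarray(pred, dtype=bool)
    target = np.asarray(target, dtype=bool)
    inter = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    total = pred.sum() + target.sum()
    # Define both metrics as 1.0 when both masks are empty.
    iou = inter / union if union else 1.0
    dice = 2.0 * inter / total if total else 1.0
    return iou, dice

# Toy example: one overlapping pixel out of two predicted pixels.
iou, dice = iou_and_dice([[1, 1], [0, 0]], [[1, 0], [0, 0]])
```

For the toy masks above, the intersection is 1 pixel and the union is 2, so IoU = 0.5 and Dice = 2·1/(2+1) ≈ 0.667; a score above 98% on both metrics indicates near-pixel-perfect agreement between predicted and ground-truth heart regions.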