Abstract

The Drosophila model has been widely used to study cardiac function, especially in combination with optogenetics and optical coherence tomography (OCT), which can continuously acquire large volumes of cross-sectional images of the Drosophila heart in vivo over time. Quickly and accurately extracting dynamic cardiac parameters, such as heartbeat rate, from these images is essential for quantitative analysis of cardiac function. Here we present a deep-learning method that integrates U-Net and generative adversarial network architectures while incorporating residually connected convolutions for high-precision OCT image segmentation of the Drosophila heart and dynamic cardiac parameter measurement in optogenetics-OCT-based cardiac function research. Compared with previous approaches, our segmentation results achieve intersection-over-union and Dice similarity coefficient scores above 98%, which can better quantify dynamic heart parameters and improve the efficiency of Drosophila-model-based cardiac research via the optogenetics-OCT-based platform.
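The two evaluation metrics named in the abstract, intersection over union (IoU) and the Dice similarity coefficient, have standard definitions for binary segmentation masks. The sketch below is an illustrative NumPy implementation of those standard formulas, not code from the paper itself; the function names and the binary-mask input convention are assumptions.

```python
import numpy as np

def iou(pred, target):
    """Intersection over union of two binary masks: |A ∩ B| / |A ∪ B|."""
    pred, target = pred.astype(bool), target.astype(bool)
    union = np.logical_or(pred, target).sum()
    if union == 0:  # both masks empty: define as perfect agreement
        return 1.0
    return np.logical_and(pred, target).sum() / union

def dice(pred, target):
    """Dice similarity coefficient of two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    pred, target = pred.astype(bool), target.astype(bool)
    total = pred.sum() + target.sum()
    if total == 0:  # both masks empty: define as perfect agreement
        return 1.0
    return 2.0 * np.logical_and(pred, target).sum() / total
```

For a perfect segmentation both scores equal 1.0 (100%); Dice is always greater than or equal to IoU for partial overlaps, so reporting both above 98% is a strong agreement claim.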
