Abstract

The heart of the fruit fly, Drosophila melanogaster, is a particularly suitable model for cardiac studies. Optical coherence microscopy (OCM) captures in vivo cross-sectional videos of the beating Drosophila heart for quantification of cardiac function. Analysis of these large, multi-frame OCM recordings has relied on manual labelling, which is inefficient and poorly reproducible. Here, we introduce a robust and accurate automated Drosophila heart segmentation algorithm, FlyNet 2.0+, which uses a long short-term memory (LSTM) convolutional neural network to leverage the time-series information in the videos, ensuring consistent, high-quality segmentation. We present a dataset of 213 Drosophila heart videos, equivalent to 604,000 cross-sectional images, spanning all developmental stages and a wide range of beating patterns, including faster- and slower-than-normal beating, arrhythmic beating, and periods of heart stop. Each video is accompanied by a corresponding ground-truth mask. We expect this unique, large in vivo dataset of the beating Drosophila heart to enable new deep learning approaches that efficiently characterize heart function and advance cardiac research.
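The abstract's core idea, using an LSTM-style recurrence over video frames so that each frame's mask benefits from temporal context, can be illustrated with a minimal convolutional LSTM segmenter. This sketch is an assumption for illustration only, not the published FlyNet 2.0+ architecture; all class names and layer sizes here are hypothetical.

```python
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    """Minimal convolutional LSTM cell: all four gates from one conv layer."""
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, h, c):
        # Split gate pre-activations into input, forget, output, candidate.
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

class RecurrentSegmenter(nn.Module):
    """Per-frame heart mask from a video: conv encoder -> ConvLSTM -> 1x1 head.

    Illustrative stand-in for a temporal segmentation network; not FlyNet 2.0+.
    """
    def __init__(self, hid_ch=16):
        super().__init__()
        self.hid_ch = hid_ch
        self.enc = nn.Sequential(nn.Conv2d(1, hid_ch, 3, padding=1), nn.ReLU())
        self.cell = ConvLSTMCell(hid_ch, hid_ch)
        self.head = nn.Conv2d(hid_ch, 1, 1)  # per-pixel mask logit

    def forward(self, video):  # video: (T, 1, H, W) grayscale frames
        T, _, H, W = video.shape
        h = video.new_zeros(1, self.hid_ch, H, W)
        c = torch.zeros_like(h)
        masks = []
        for t in range(T):  # recurrence carries heart shape across frames
            h, c = self.cell(self.enc(video[t:t + 1]), h, c)
            masks.append(torch.sigmoid(self.head(h)))
        return torch.cat(masks, dim=0)  # (T, 1, H, W) masks in (0, 1)

model = RecurrentSegmenter()
with torch.no_grad():
    out = model(torch.randn(6, 1, 16, 16))  # 6-frame toy "video"
```

The recurrence is what distinguishes this from frame-by-frame segmentation: the hidden state lets the mask stay consistent through low-contrast frames and irregular beats, which is the motivation the abstract gives for the LSTM component.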
