Abstract

Time-lapse microscopy imaging is a crucial technique in biomedical studies for observing cellular behavior over time, providing essential data on cell numbers, sizes, shapes, and interactions. Manual analysis of hundreds or thousands of cells is impractical, necessitating the development of automated cell segmentation approaches. Traditional image processing methods have made significant progress in this area, but the advent of deep learning methods, particularly U-Net-based networks, has further improved performance in medical and microscopy image segmentation. However, challenges remain, particularly in accurately segmenting touching cells in images with low signal-to-noise ratios. Existing methods often struggle to integrate features across different levels of abstraction effectively, which can confuse the model, particularly when important contextual information is lost or the features are not adequately distinguished. The challenge lies in combining these features so that critical details are preserved while segmentation remains robust and accurate. To address these issues, we propose a novel framework, RA-SE-ASPP-Net, which incorporates residual blocks, an attention mechanism, squeeze-and-excitation connections, and atrous spatial pyramid pooling to achieve precise and robust cell segmentation. We evaluate the proposed architecture on an induced pluripotent stem cell reprogramming dataset, a challenging dataset that has received limited attention in this field, and compare it against a series of ablation experiments to demonstrate its robustness. The proposed architecture outperforms the baseline models on all evaluated metrics, providing the most accurate semantic segmentation results. Finally, we apply the watershed method to the semantic segmentation results to obtain instance-level segmentations with per-cell information.
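The abstract does not give implementation details, but the final step it describes, applying the watershed method to the network's semantic output to separate touching cells, is a standard post-processing technique. The sketch below shows one common way this step might look, assuming a binary foreground mask predicted by the segmentation network and using SciPy/scikit-image routines; the function name, the `min_distance` default, and the returned per-cell statistics are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.measure import regionprops
from skimage.segmentation import watershed

def separate_touching_cells(semantic_mask, min_distance=10):
    """Split a binary semantic mask into individual cell instances.

    semantic_mask : 2-D array (0/1 or bool) predicted by the segmentation network.
    min_distance  : minimum pixel spacing between cell markers (illustrative default).
    """
    mask = semantic_mask.astype(bool)

    # Distance transform: values peak near the centre of each cell.
    distance = ndi.distance_transform_edt(mask)

    # Place one marker per local maximum of the distance map.
    peak_coords = peak_local_max(distance, min_distance=min_distance, labels=mask)
    markers = np.zeros(distance.shape, dtype=np.int32)
    markers[tuple(peak_coords.T)] = np.arange(1, len(peak_coords) + 1)

    # Watershed on the inverted distance map, restricted to the foreground,
    # assigns each foreground pixel to the nearest marker and splits touching cells.
    labels = watershed(-distance, markers, mask=mask)

    # Per-cell measurements (label, area, centroid) for each separated instance.
    stats = [(p.label, p.area, p.centroid) for p in regionprops(labels)]
    return labels, stats
```

In this hypothetical pipeline, `labels` is an instance map (one integer per cell) and `stats` carries the per-cell information mentioned in the abstract; the distance-transform-plus-watershed pattern is the usual way to split clumped objects in a binary mask.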
