Abstract

Recently, image analysis techniques have been introduced to automate nematode information assessment. In such pipelines, the initial step is to detect and segment C. elegans in microscopic images, and network-based methods have been investigated for this purpose. However, training a network for C. elegans image segmentation typically requires labor-intensive pixel-level mask labeling. To address this challenge, we introduce a weakly supervised segmentation method based on multiple instance learning (WSM-MIL). The proposed method comprises three key components: a backbone network, a detection branch, and a segmentation branch. In contrast to fully supervised pixel-level annotation, we adopt weakly supervised bounding box-level annotation, which reduces the labor cost of annotation. The approach offers several advantages, including simplicity, an end-to-end architecture, and good scalability. We compared the proposed network with benchmark methods, and the results show that it achieves competitive performance on the C. elegans image segmentation task. This study provides an effective method for biological image analysis, as well as new ideas for solving complex segmentation tasks. The method is applicable not only to the study of C. elegans but also to biological image segmentation problems in other fields.
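The abstract does not give implementation details, but the core idea of box-level, MIL-style supervision for segmentation can be sketched as follows. This is a minimal illustrative example, not the authors' WSM-MIL implementation: it assumes a PyTorch-style model in which a shared backbone feeds a detection head and a segmentation head, and each ground-truth box is treated as a positive bag of pixels while pixels outside all boxes are treated as negatives. All class names, layer sizes, and loss terms are assumptions made for illustration.

```python
# Hypothetical sketch of a box-supervised, MIL-style segmentation model.
# Not the authors' code; architecture and hyper-parameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class WeakSegNet(nn.Module):
    def __init__(self, num_classes: int = 1):
        super().__init__()
        # Backbone: a small conv stack standing in for e.g. a ResNet.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Detection branch: per-pixel objectness (a real detector would
        # also regress box coordinates).
        self.det_head = nn.Conv2d(64, 1, 1)
        # Segmentation branch: per-pixel foreground logits.
        self.seg_head = nn.Conv2d(64, num_classes, 1)

    def forward(self, x):
        feats = self.backbone(x)
        det_logits = self.det_head(feats)
        seg_logits = F.interpolate(self.seg_head(feats), size=x.shape[-2:],
                                   mode="bilinear", align_corners=False)
        return det_logits, seg_logits


def mil_box_loss(seg_logits, boxes):
    """MIL-style loss: each box is a positive bag whose strongest pixel
    response should be foreground; pixels outside all boxes are negatives."""
    probs = torch.sigmoid(seg_logits)            # (N, 1, H, W)
    outside = torch.ones_like(probs, dtype=torch.bool)
    pos_terms = []
    for i, img_boxes in enumerate(boxes):        # boxes: list of (K, 4) tensors
        for x1, y1, x2, y2 in img_boxes.long():
            bag = probs[i, 0, y1:y2, x1:x2]      # positive bag inside the box
            pos_terms.append(-torch.log(bag.max() + 1e-6))
            outside[i, 0, y1:y2, x1:x2] = False
    neg = probs[outside]
    loss_neg = -torch.log(1.0 - neg + 1e-6).mean() if neg.numel() else 0.0
    loss_pos = torch.stack(pos_terms).mean() if pos_terms else 0.0
    return loss_pos + loss_neg


if __name__ == "__main__":
    model = WeakSegNet()
    imgs = torch.rand(2, 3, 128, 128)
    boxes = [torch.tensor([[20.0, 30.0, 80.0, 90.0]]),
             torch.tensor([[10.0, 10.0, 60.0, 70.0]])]
    det, seg = model(imgs)
    print(mil_box_loss(seg, boxes))
```

In this sketch the segmentation branch is trained end to end from box annotations alone; the detection branch would supply boxes at inference time when ground truth is unavailable. The actual WSM-MIL architecture and loss may differ.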
