Abstract

This article proposes a confidence-based approach for combining two visual tracking techniques to minimize the influence of unforeseen visual tracking failures and achieve uninterrupted vision-based control. Despite research efforts in vision-guided micromanipulation, existing systems are not designed to overcome visual tracking failures such as inconsistent illumination conditions, regional occlusion, unknown structures, and nonhomogeneous background scenes. A gap therefore remains in extending current procedures beyond the laboratory environment toward practical deployment of vision-guided micromanipulation systems. A hybrid tracking method, which combines motion-cue feature detection and score-based template matching, is incorporated into an uncalibrated vision-guided workflow capable of self-initialization and recovery during micromanipulation. A weighted average, based on the respective confidence indices of the motion-cue feature-localization and template-based trackers, is inferred from the statistical accuracy of the feature locations and the similarity scores of the template matches. Results suggest that hybrid tracking improves tracking performance under these conditions: the mean errors of hybrid tracking remain at the subpixel level under adverse experimental conditions, whereas the original template-matching approach has mean errors of 1.53, 1.73, and 2.08 pixels. The method is also shown to be robust in a nonhomogeneous scene containing an array of plant cells. By proposing a self-contained fusion method that overcomes unforeseen visual tracking failures using a purely vision-based approach, we demonstrate the robustness of our low-cost micromanipulation platform.

Note to Practitioners — Cell manipulation is traditionally performed in highly specialized facilities and controlled environments. Existing vision-based methods do not readily fulfill the unique requirements of cell manipulation, including prospective plant cell-related applications. Robust visual tracking is needed to overcome visual tracking failures during automated vision-guided micromanipulation. To address the gap in maintaining continuous tracking for vision-guided micromanipulation under unforeseen visual tracking failures, we propose a purely visual, data-driven hybrid tracking approach. Our confidence-based approach combines two tracking techniques to minimize the influence of scene uncertainties, thereby achieving uninterrupted vision-based control. Because of its readily deployable design, the method can be generalized to a wide range of vision-guided micromanipulation applications. It has the potential to significantly expand the capability of cell manipulation technology, even to prospective, as-yet-unexplored applications associated with plant cells.
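As a concrete illustration of the confidence-based fusion described above, the two trackers' position estimates can be combined by a weighted average with weights normalized from their respective confidence indices. The sketch below is illustrative only; the function and variable names are our own assumptions, not the paper's notation:

```python
import numpy as np

def fuse_estimates(p_feat, c_feat, p_tmpl, c_tmpl):
    """Confidence-weighted fusion of two tracker position estimates.

    p_feat, p_tmpl: (x, y) estimates from the motion-cue feature tracker
    and the template-matching tracker. c_feat, c_tmpl: their non-negative
    confidence indices. (Hypothetical names, for illustration only.)
    """
    p_feat = np.asarray(p_feat, dtype=float)
    p_tmpl = np.asarray(p_tmpl, dtype=float)
    total = c_feat + c_tmpl
    if total == 0.0:
        # Neither tracker is trusted: fall back to the unweighted mean.
        return (p_feat + p_tmpl) / 2.0
    w_feat = c_feat / total  # normalized weights sum to 1
    return w_feat * p_feat + (1.0 - w_feat) * p_tmpl
```

When one tracker fails (its confidence drops toward zero), its weight vanishes and the fused estimate degrades gracefully to the other tracker's output, which is the behavior that keeps the vision-based control loop uninterrupted.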

Highlights

  • The importance of robotic micromanipulation systems is well evidenced in their contribution toward the advancement of micromanipulation technology

  • The formalized method addresses the gap identified in the above discussion while maintaining relevance to existing vision-based control methods. This is especially relevant to biomedical applications, such as embryo biopsy, blastomere isolation, or preimplantation genetic diagnosis (PGD) [40]–[42], and prospective plant cell manipulation applications [8]–[11], where the imaged scene can be challenging for visual tracking due to the nonhomogeneous scene with an array of irregular cell dimensions

  • The extent of adverse influence on the performance of visual tracking is observed by intentionally varying the illumination conditions and including regional artifacts. The former results in the unpredictable intensity distribution in the scene while the latter leads to an unforeseen regional occlusion. Both conditions are common in the deployment of the portable microscope in an uncontrolled environment, which is in alignment with our research vision toward ubiquitous micromanipulation beyond the laboratory setting
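A confidence index for the template-based tracker, as described in the abstract, can be derived from a match-similarity score such as normalized cross-correlation, which degrades naturally under the illumination changes and occlusions tested above. A minimal sketch, with an illustrative helper name that is not the paper's exact formulation:

```python
import numpy as np

def ncc_score(patch, template):
    """Normalized cross-correlation between an image patch and a template.

    Returns a similarity in [-1, 1]; higher means a better match. A score
    of this kind can serve as the template tracker's confidence index.
    (Illustrative helper, not the paper's exact formulation.)
    """
    a = patch.astype(float) - patch.mean()
    b = template.astype(float) - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        # A flat patch or template carries no correlation information.
        return 0.0
    return float((a * b).sum() / denom)
```

Because the score is computed from mean-subtracted intensities, a uniform brightness shift leaves it unchanged, while occlusion of part of the patch lowers it, signaling the fusion stage to shift weight toward the feature-based tracker.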


Summary

INTRODUCTION

The importance of robotic micromanipulation systems is well evidenced in their contribution toward the advancement of micromanipulation technology. Apart from the importance of robotic vision-guided micromanipulation, this work (Yang et al., "Confidence-Based Hybrid Tracking to Overcome Visual Tracking Failures") is motivated by the existing gap in addressing visual tracking failure, which hinders general deployment outside the laboratory environment. All these factors result in a gap in the development of robotic vision-guided micromanipulation, including unprecedented applications such as plant cell manipulation [7]–[11]. There is conceptually no need for excessive specification of imaging conditions, tracking requirements, or prior models of the physical setup through calibration. This approach overcomes problems in conventional visual tracking associated with the mentioned unforeseen visual uncertainties.

RELATED WORK
Conceptual Overview
Fusion via Normalized Weighted Averaging
Overcoming Visual Tracking Failures
Conditions
RESULTS AND DISCUSSION
Qualitative Observation
Quantitative Evaluation
Demonstrating Robustness in Hybrid Tracking
Validation on Plant Cell Applications
CONCLUSION
