Abstract

Active Appearance Models (AAMs) are used for annotating or segmenting shapes in biomedical images. Their performance relies heavily on the image data used to train the model. In this paper we improve the generalization properties of the AAM by making it robust to the slowly varying spatial intensity inhomogeneities often seen in Light Sheet Fluorescence Microscopy (LSFM) images. This robustness is achieved by modelling the appearance of an image as a regularized Normalized Gradient Field (rNGF). We perform two experiments to challenge the model. First, it is tested using a repeated leave-one-out approach on images with minimal imperfections, where the left-out images are corrupted by a simulated bias field and segmented using the AAM. Second, we test the model on LSFM images with common acquisition problems. In both experiments the proposed approach outperforms the widely used AAM implementation based on the Sum of Squared Differences (SSD).
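To illustrate why an rNGF representation is insensitive to a slowly varying bias field, the sketch below computes a regularized normalized gradient field of a toy image. This is a minimal illustration of the general rNGF idea, not the paper's implementation; the function name, the regularization constant `epsilon`, and the toy data are assumptions for demonstration.

```python
import numpy as np

def regularized_ngf(image, epsilon=0.5):
    """Regularized Normalized Gradient Field of a 2-D image (illustrative sketch).

    Each pixel's gradient is divided by sqrt(|grad|^2 + epsilon^2), so regions
    dominated by a slowly varying bias field yield near-zero vectors, while
    strong edges keep near-unit direction vectors.
    """
    gy, gx = np.gradient(image.astype(float))      # gradients along rows, cols
    magnitude = np.sqrt(gx**2 + gy**2 + epsilon**2)
    return gx / magnitude, gy / magnitude

# Toy image: a horizontal intensity ramp (simulated bias field) plus a step edge.
x = np.linspace(0.0, 1.0, 64)
ramp = np.tile(x, (64, 1))                         # slowly varying inhomogeneity
step = np.zeros((64, 64))
step[:, 32:] = 5.0                                 # sharp anatomical edge
nx, ny = regularized_ngf(ramp + step)
```

In the bias-only region the rNGF response stays close to zero, whereas at the step edge it stays close to one, which is what makes an appearance model built on this representation robust to intensity inhomogeneity.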
