Abstract
We tackle the problem of template estimation when data have been randomly deformed under a group action in the presence of noise. In order to estimate the template, one often minimizes the variance once the influence of the transformations has been removed (computation of the Fréchet mean in the quotient space). The consistency bias is defined as the distance (possibly zero) between the orbit of the template and the orbit of an element which minimizes the variance. In the first part, we restrict ourselves to isometric group actions; in this case the Hilbertian distance is invariant under the group action. We establish the asymptotic behavior of the consistency bias, which is linear with respect to the noise level. As a result, the inconsistency is unavoidable as soon as the noise level is large enough. In practice, template estimation with a finite sample is often done with an algorithm called "max-max". In the second part, again in the case of a finite isometric group, we show the convergence of this algorithm to an empirical Karcher mean. Our numerical experiments show that the bias observed in practice cannot be attributed to the small sample size or to a convergence problem, but is indeed due to the previously studied inconsistency. In the third part, we present some insights into the case of a distance that is not invariant under the group action. We will see that the inconsistency still holds as soon as the noise level is large enough. Moreover, we prove the inconsistency even when a regularization term is added.
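The "max-max" algorithm mentioned above alternates between aligning each observation to the current template estimate and averaging the aligned observations. A minimal sketch, assuming a finite isometric group of cyclic time shifts acting on discretized signals (the function name and the choice of group are illustrative, not the paper's exact setup):

```python
import numpy as np

def max_max(observations, n_iter=20):
    """Sketch of the max-max algorithm for a finite isometric group.

    Alternates two steps until a fixed point (an empirical Karcher
    mean) is approached:
      1. align each observation to the current template estimate by
         searching over all cyclic time shifts (an isometric action),
      2. average the aligned observations to update the template.
    """
    n, d = observations.shape
    template = observations.mean(axis=0)  # initial guess: plain mean
    for _ in range(n_iter):
        aligned = []
        for y in observations:
            # exhaustive search over the finite group of shifts
            shifts = [np.roll(y, k) for k in range(d)]
            dists = [np.linalg.norm(s - template) for s in shifts]
            aligned.append(shifts[int(np.argmin(dists))])
        template = np.mean(aligned, axis=0)
    return template
```

Because the group is finite, each alignment step is an exact minimization, so the variance decreases at every iteration; the limit, however, is subject to the consistency bias studied in the paper.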
Highlights
A popular algorithm consists in minimizing the variance, in other words, computing the Fréchet mean in the quotient space
We start with Theorem 1 which gives us an asymptotic behavior of the consistency bias when the noise level σ tends to infinity
We provided an asymptotic behavior of the consistency bias when the noise level σ tends to infinity in the case of isometric action
Summary
Template estimation is a well-known issue in different fields such as statistics on signals [1], shape theory, and computational anatomy [2,3,4]. A popular algorithm consists in minimizing the variance, in other words, computing the Fréchet mean in the quotient space. This method has already been proved to be inconsistent [5,6,7]. We first estimate the best time translation which aligns f0 and f1, and then compute the L2-norm after this alignment step. In this example, we find that the distance is ≈ 0.02, whereas after alignment the distance between f0 and f2 is still ≈ 0.6. With this new way of comparing functions, f0 looks like f1 but does not look like f2. We make this construction precise in general. This idea of using deformations/transformations in order to compare objects is not new: it was already proposed by D'Arcy Thompson [10] at the beginning of the 20th century, in order to classify species.
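The alignment-then-distance comparison described above can be sketched as follows. This is a minimal illustration with synthetic signals: f0 a sine, f1 an exact time-shifted copy of f0, and f2 a square wave with a different shape. These stand-in signals are assumptions for illustration, not the paper's actual example (so the numerical distances differ from the 0.02 and 0.6 quoted above):

```python
import numpy as np

def aligned_distance(f, g):
    """L2 distance after quotienting out time translations:
    minimize ||f - g shifted by k|| over all cyclic shifts k."""
    return min(np.linalg.norm(f - np.roll(g, k)) for k in range(len(g)))

t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
f0 = np.sin(t)
f1 = np.roll(f0, 30)       # f1 is f0 deformed by a time translation
f2 = np.sign(np.sin(t))    # f2 has a genuinely different shape

# aligned_distance(f0, f1) is 0: some shift maps f1 back onto f0 exactly,
# whereas aligned_distance(f0, f2) stays large: no shift makes a square
# wave resemble a sine.
```

This is exactly the quotient-space point of view: two functions are close when some group element (here, a time translation) maps one near the other.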