Abstract

Among many optimization algorithms, the gradient descent algorithm (GDA) is a simple tool for deriving an optimal quantity when dealing with an optimization problem in a linear space. Apart from the initial value, the step size has a great impact on the convergence rate of this algorithm. Its effect on the geometric structure of the consecutive configurations is even more crucial when one works with an optimization problem in statistical shape analysis. In other words, if the step size of the GDA is not properly tuned, the geometry might not be preserved while the algorithm moves toward an optimal mean shape. To improve the performance of the GDA, we introduce a dynamic step size and a new criterion, both to check the geometry in each step of the algorithm and to accelerate the convergence rate. These lead to a new robust algorithm for deriving the intrinsic mean on the shape space. We compare the performance of our proposed procedure with that of the usual GDA using a real shape data set accompanied by simulation studies.

Highlights

  • It is known from elementary statistics that the mean of a set of data lying in a Euclidean space is the minimizer of the sum of squared Euclidean distances from a fixed point to the observations at hand (see the sketch after this list)

  • Since the geometrical form is vital in statistical shape analysis, we aim to extend the gradient descent algorithm (GDA) to overcome such potential problems

  • Assuming the Procrustes mean of this data set describes well the average geometrical features of all 58 healthy adult brains, the intrinsic mean shape returned by the robust gradient descent algorithm (RGDA) is a better representative of the mean shape than that given by the GDA
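
To make the first highlight concrete, here is a minimal sketch of plain gradient descent on F(m) = (1/n) Σᵢ ‖m − xᵢ‖², whose unique minimizer is the arithmetic mean of the sample. The step size eta, tolerance, and iteration cap are illustrative choices, not values from the paper.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): fixed-step gradient
# descent on F(m) = (1/n) * sum_i ||m - x_i||^2, minimized by the sample mean.
# `eta`, `tol`, and `max_iter` are assumed values chosen for demonstration.
def gda_euclidean_mean(X, eta=0.1, tol=1e-10, max_iter=1000):
    m = X[0].copy()                        # initial value: first observation
    for _ in range(max_iter):
        grad = 2.0 * (m - X.mean(axis=0))  # gradient of F at m
        if np.linalg.norm(grad) < tol:
            break
        m -= eta * grad                    # fixed-step GDA update
    return m

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))              # 100 points in the plane
print(np.allclose(gda_euclidean_mean(X), X.mean(axis=0)))  # -> True
```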


Summary

Introduction

It is known from elementary statistics that the mean of a set of data lying in a Euclidean space is the minimizer of the sum of squared Euclidean distances from a fixed point to the observations at hand. We will demonstrate how, at each stage of the GDA, a poorly tuned step size both increases the convergence time and fails to preserve the geometrical structure of the objects before the intrinsic mean shape is reached. This is because an optimal choice of the step size parameter accelerates the convergence rate and maintains the geometrical structure of the shape under study. We propose a new procedure, called the robust gradient descent algorithm (RGDA), which is more resistant than the standard GDA. The performance of both the GDA and the RGDA on the real data set and in a simulation study is evaluated in Sect.
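The following hedged sketch illustrates the general idea on the unit sphere, a simple curved space standing in for the shape space: the intrinsic (Fréchet) mean is found by gradient descent through the sphere's log and exp maps, and the step size is halved whenever the Fréchet variance fails to decrease. The halving rule is a crude stand-in for the paper's geometry-checking criterion; all names and constants here are illustrative and are not the authors' RGDA.

```python
import numpy as np

def log_map(p, q):
    """Tangent vector at p pointing toward q along the geodesic on S^2."""
    d = np.arccos(np.clip(p @ q, -1.0, 1.0))  # geodesic distance
    if d < 1e-12:
        return np.zeros_like(p)
    v = q - (p @ q) * p                        # project q onto tangent space at p
    return d * v / np.linalg.norm(v)

def exp_map(p, v):
    """Point reached from p along the geodesic with initial velocity v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return p
    return np.cos(nv) * p + np.sin(nv) * (v / nv)

def frechet_var(m, points):
    """Mean squared geodesic distance from m to the sample."""
    return np.mean([np.arccos(np.clip(m @ q, -1.0, 1.0)) ** 2 for q in points])

def intrinsic_mean(points, step=1.0, tol=1e-9, max_iter=500):
    m = points[0]                              # initial value: first observation
    for _ in range(max_iter):
        # descent direction: average of log maps (negative gradient direction
        # of the Frechet variance, up to a constant)
        grad = np.mean([log_map(m, q) for q in points], axis=0)
        if np.linalg.norm(grad) < tol:
            break
        m_new = exp_map(m, step * grad)
        # dynamic step size: if the Frechet variance did not decrease, halve
        # the step and retry (a crude stand-in for the paper's criterion)
        if frechet_var(m_new, points) >= frechet_var(m, points):
            step *= 0.5
            continue
        m = m_new
    return m

# three unit vectors clustered around the north pole
pts = np.array([[0.0, 0.1, 1.0], [0.1, 0.0, 1.0], [-0.1, -0.1, 1.0]])
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
print(intrinsic_mean(pts))  # close to the normalized average direction
```

With step = 1 this reduces to the classical fixed-point iteration for the Fréchet mean; the point of the dynamic rule is that an overshooting step is rejected rather than allowed to carry the iterate away from the data, which mirrors, in spirit, how a tuned step size preserves geometry on the way to the mean shape.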

A brief review of shape analysis
GDA and its extension on shape space
Robust gradient descent algorithm to derive mean shape
Simulation studies and real data analysis
Findings
Conclusion