Abstract

In image registration, the optimal transformation parameters of a given transformation model are typically obtained by minimizing a cost function. Stochastic gradient descent (SGD) is an efficient optimization algorithm for image registration. In SGD optimization, a stochastic approximation of the cost-function derivative is used in each iteration to update the transformation parameters. The stochastic approximation error leads to a large variance in the parameters. To enforce convergence nonetheless, SGD methods are typically implemented in combination with a gradually decreasing update step size. However, selecting a proper sequence of step sizes is a major challenge in practice. An alternative strategy in numerical optimization is to use a constant step size and enforce convergence by averaging the parameters obtained by SGD over several iterations. It has been proven mathematically that this averaging achieves the highest possible rate of convergence. Inspired by this work, we propose an averaged SGD (Avg-SGD) method for efficient image registration. In the Avg-SGD approach, a constant step size is used in combination with an exponentially weighted iterate averaging scheme. Experiments on 3D lung CT scans demonstrate the effectiveness of the Avg-SGD method in terms of convergence rate, accuracy, and precision.
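To make the averaging idea concrete, the following is a minimal sketch of constant-step-size SGD with exponentially weighted iterate averaging, in the spirit of Polyak–Ruppert averaging. It is not the authors' implementation: the function `grad_fn`, the step size, the averaging weight `beta`, and the iteration count are illustrative assumptions, not values from the paper.

```python
import numpy as np

def avg_sgd(grad_fn, theta0, step_size=0.1, beta=0.9, n_iters=1000):
    """Constant-step-size SGD with exponentially weighted iterate averaging.

    grad_fn(theta) is assumed to return a stochastic approximation of the
    cost-function derivative at theta (e.g., computed from a random subset
    of image samples). All hyperparameter defaults here are illustrative.
    """
    theta = np.asarray(theta0, dtype=float)
    theta_avg = theta.copy()
    for _ in range(n_iters):
        g = grad_fn(theta)                     # stochastic gradient estimate
        theta = theta - step_size * g          # plain SGD update, fixed step
        # Exponentially weighted average of the iterates: the average, not
        # the raw iterate, is returned as the final parameter estimate.
        theta_avg = beta * theta_avg + (1.0 - beta) * theta
    return theta_avg
```

The raw iterates `theta` keep fluctuating because the step size never decays; the running average `theta_avg` damps that variance, which is what allows convergence without a hand-tuned step-size schedule.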
