Abstract
We investigate stochastic gradient descent (SGD) for solving fully infinite-dimensional ill-posed problems in Hilbert spaces. We allow for batch versions of SGD in which the randomly chosen batches incur noise fluctuations. Based on the corresponding bias-variance decomposition, we provide bounds for the root mean squared error. These bounds take into account the discretization levels, the decay of the step size, which is more flexible than in existing results, and the underlying smoothness in terms of general source conditions; this allows SGD to be applied to severely ill-posed problems. The obtained error bounds exhibit three stages in the performance of SGD; in particular, the pre-asymptotic behavior is clearly visible. Numerical studies verify the theoretical predictions.
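To make the iteration concrete, the following Python sketch runs mini-batch SGD with a polynomially decaying step size on a synthetic discretized ill-posed least-squares problem. The operator (built from an exponentially decaying spectrum to mimic severe ill-posedness), the smoothness of the exact solution, the noise level, and the step-size parameters are all illustrative assumptions and not the setting or experiments of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discretized severely ill-posed problem: an operator with an
# exponentially decaying spectrum (illustration only).
m, n = 200, 100                                   # discretization levels
U, _ = np.linalg.qr(rng.standard_normal((m, m)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.exp(-0.1 * np.arange(n))                   # decaying singular values
A = U[:, :n] @ np.diag(s) @ V.T

x_true = V @ (s**2 * rng.standard_normal(n))      # smooth (source-condition-like) solution
delta = 1e-3                                      # noise level
y = A @ x_true + delta * rng.standard_normal(m)

# Mini-batch SGD on 0.5 * ||A x - y||^2: each step uses a random batch of
# rows, with polynomially decaying step size eta_k = eta0 * (k + 1)**(-theta).
eta0, theta, batch = 1.0, 0.5, 10
x = np.zeros(n)
for k in range(5000):
    idx = rng.choice(m, size=batch, replace=False)     # random batch of equations
    grad = A[idx].T @ (A[idx] @ x - y[idx]) / batch    # stochastic gradient estimate
    x -= eta0 * (k + 1) ** (-theta) * grad

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

In this toy setup, the root mean squared error discussed in the abstract would be estimated by averaging the squared error over repeated noise and batch realizations; tracking it along the iterations is one way to observe the different stages of the SGD error behavior.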