Abstract

We show that the LSPIA method for curve and surface approximation, which was introduced by Deng and Lin (2014), is equivalent to a gradient descent method. We also note that Deng and Lin's results concerning feasible values of the stepsize are directly implied by classical results about the convergence of gradient descent. We propose a modification based on stochastic gradient descent, which lends itself to a realization that employs the technology of neural networks. In addition, we show how to incorporate the optimization of the parameterization of the given data into this framework via parameter correction (PC). This leads to the new LSPIA-PC method and its neural-network-based implementation. Numerical experiments indicate that it gives better results than LSPIA at comparable computational cost.
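The stated equivalence can be illustrated with a small numerical sketch. Below, A plays the role of a collocation matrix of B-spline basis functions evaluated at the data parameters (here a synthetic stand-in, not taken from the paper), Q holds the data points, and P the control points. One LSPIA iteration, which moves each control point by a weighted combination of the residuals, coincides with one gradient descent step on the least-squares objective f(P) = ½‖AP − Q‖², using the same stepsize; the classical feasibility condition is 0 < μ < 2/λ_max(AᵀA).

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, d = 50, 10, 2                  # data points, control points, dimension
A = np.abs(rng.random((m, n)))
A /= A.sum(axis=1, keepdims=True)    # rows sum to 1, mimicking B-spline bases
Q = rng.random((m, d))               # data points to be approximated
P = rng.random((n, d))               # initial control points

# A feasible stepsize, strictly inside (0, 2 / lambda_max(A^T A)).
mu = 1.0 / np.linalg.eigvalsh(A.T @ A).max()

# One LSPIA iteration: update control points by the weighted residuals.
P_lspia = P + mu * (A.T @ (Q - A @ P))

# One gradient descent step on f(P) = 0.5 * ||A P - Q||_F^2.
grad = A.T @ (A @ P - Q)
P_gd = P - mu * grad

print(np.allclose(P_lspia, P_gd))    # the two updates coincide
```

The stochastic variant described in the abstract would replace the full residual sum AᵀC(Q − AP) by a sum over a random subset of the data points in each step, in the usual SGD manner.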
