Abstract

It is first proved that the EM (estimate, maximize) and OSL (one-step-late) algorithms, when applied to ridge regression problems, are special cases of the so-called linear stationary methods of the first degree for the underlying system of linear equations. It is then shown that, although the EM and OSL algorithms converge, their optimum extrapolated counterparts converge faster. Using an incomplete-data argument, an alternative interpretation of the extrapolated methods is given, which allows the full potential of the optimum extrapolated methods to be exploited.
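
The following is a minimal sketch, not the paper's EM or OSL derivation: it only illustrates the two general notions the abstract relies on, namely a linear stationary method of the first degree x_{k+1} = G x_k + c for the ridge normal equations (A^T A + lambda I) x = A^T b, and its optimum extrapolated variant x_{k+1} = gamma (G x_k + c) + (1 - gamma) x_k with gamma chosen from the extreme eigenvalues of G. All names (A, b, lam, omega) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
b = rng.standard_normal(50)
lam = 0.5                                  # ridge penalty (assumed value)

M = A.T @ A + lam * np.eye(A.shape[1])     # ridge normal-equations matrix
rhs = A.T @ b

# Basic stationary first-degree iteration (Richardson splitting):
# G = I - omega*M, c = omega*rhs, fixed point solves M x = rhs.
omega = 1.0 / np.linalg.norm(M, 2)
G = np.eye(M.shape[0]) - omega * M
c = omega * rhs

def iterate(G, c, x0, n):
    """Apply x <- G x + c for n steps (first degree: depends on current iterate only)."""
    x = x0
    for _ in range(n):
        x = G @ x + c
    return x

# Optimum extrapolation: gamma = 2 / (2 - m - Mx), where m and Mx are the
# smallest and largest eigenvalues of G; this minimizes the spectral radius
# of the extrapolated iteration matrix gamma*G + (1 - gamma)*I.
eigs = np.linalg.eigvalsh(G)
gamma = 2.0 / (2.0 - eigs.min() - eigs.max())
G_ext = gamma * G + (1.0 - gamma) * np.eye(G.shape[0])
c_ext = gamma * c                           # same fixed point as the basic iteration

x_star = np.linalg.solve(M, rhs)            # reference ridge solution
x0 = np.zeros(M.shape[0])
for n in (10, 50, 200):
    e_basic = np.linalg.norm(iterate(G, c, x0, n) - x_star)
    e_ext = np.linalg.norm(iterate(G_ext, c_ext, x0, n) - x_star)
    print(f"n={n:4d}  basic error={e_basic:.2e}  extrapolated error={e_ext:.2e}")
```

Because both iterations share the same fixed point, the printed errors only compare convergence speed; under these assumptions the extrapolated variant shrinks the error faster, which is the qualitative behaviour the abstract claims for the optimum extrapolated EM and OSL methods.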
