Abstract
One-step gradient methods of steepest descent, minimal residuals, and minimal errors are considered. Recurrence forms of these methods are described; these forms make it possible to halve the computational cost per iteration. The examples presented demonstrate that reducing (truncating) the step by a factor of ≈7/8 improves the convergence speed to nearly that of the analogous conjugate direction methods (the fastest methods for problems of general form).
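The sketch below is a minimal illustration, not the authors' implementation: steepest descent for a symmetric positive definite system A x = b, where the exact minimizing step is scaled by an under-relaxation factor of ≈7/8 and the residual is updated by recurrence so that only one matrix-vector product is needed per iteration. The names `A`, `b`, `x0`, `factor`, and `tol` are illustrative assumptions rather than the paper's notation.

```python
import numpy as np

def truncated_steepest_descent(A, b, x0, factor=7/8, tol=1e-10, max_iter=10_000):
    """Steepest descent with the step length reduced by `factor` (a sketch,
    assuming A is symmetric positive definite)."""
    x = x0.astype(float).copy()
    r = b - A @ x                       # initial residual r_0 = b - A x_0
    b_norm = np.linalg.norm(b)
    for _ in range(max_iter):
        Ar = A @ r                      # the only matrix-vector product per step
        alpha = (r @ r) / (r @ Ar)      # exact steepest-descent step for SPD A
        x += factor * alpha * r         # truncated (under-relaxed) step
        r -= factor * alpha * Ar        # recurrent residual update, no extra A @ v
        if np.linalg.norm(r) <= tol * b_norm:
            break
    return x

# Small SPD test problem (illustrative only).
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)           # well-conditioned SPD matrix
b = rng.standard_normal(50)
x = truncated_steepest_descent(A, b, np.zeros(50))
print(np.linalg.norm(A @ x - b))
```

In this sketch the recurrent update of the residual replaces a second matrix-vector product, which is the kind of saving the abstract attributes to the recurrence forms of the methods.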