Abstract

Consider a linear system $ Ax = b $ whose coefficient matrix $ A $ is rectangular and of full column rank. We propose an iterative algorithm for solving this linear system, based on a gradient-descent optimization technique, aiming to produce a sequence of well-approximated least-squares solutions. Here, we treat least-squares solutions in full generality: we measure any related error through an arbitrary vector norm induced by a positive definite weight matrix $ W $. It turns out that when the system has a unique solution, the proposed algorithm produces approximate solutions converging to that unique solution. When the system is inconsistent, the sequence of residual norms converges to the weighted least-squares error. Our work includes the usual least-squares solution as the special case $ W = I $. Numerical experiments are performed to validate the capability of the algorithm. Moreover, its performance is better than that of recent gradient-based iterative algorithms in both iteration count and computational time.
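
As a rough illustration of the kind of gradient-descent iteration the abstract describes, the sketch below minimizes the weighted objective $ \tfrac{1}{2}(Ax-b)^T W (Ax-b) $ by steepest descent with an exact line-search step size. The step-size rule, stopping test, and all names here are assumptions made for illustration; they need not coincide with the algorithm actually proposed in the paper.

```python
import numpy as np

def weighted_ls_gradient_descent(A, b, W, tol=1e-10, max_iter=10000):
    """Steepest-descent sketch for min_x (1/2)(Ax - b)^T W (Ax - b).

    Illustrative only: the exact line-search step used here is one
    plausible choice for the quadratic objective; the paper's update
    rule may differ.
    """
    m, n = A.shape
    x = np.zeros(n)                # initial guess
    AtW = A.T @ W                  # cache A^T W
    H = AtW @ A                    # Hessian A^T W A (SPD when A has full column rank)
    for _ in range(max_iter):
        g = AtW @ (A @ x - b)      # gradient of the weighted objective
        gHg = g @ (H @ g)
        if gHg == 0.0 or np.linalg.norm(g) < tol:
            break                  # (near-)stationary point reached
        t = (g @ g) / gHg          # exact line-search step length
        x = x - t * g              # gradient-descent update
    return x

# Small usage example on an overdetermined (generally inconsistent) system
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 3))
b = rng.standard_normal(8)
W = np.diag(rng.uniform(0.5, 2.0, size=8))          # positive definite weight matrix
x_gd = weighted_ls_gradient_descent(A, b, W)
x_ne = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)    # weighted normal equations
print(np.linalg.norm(x_gd - x_ne))                  # should be near machine-level small
```

For an inconsistent system the iterates approach the weighted least-squares solution, so the weighted residual norm $ \|Ax_k - b\|_W $ settles at the least-squares error rather than at zero; with $ W = I $ the sketch reduces to ordinary least squares.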
