Abstract
Tensor train decomposition (TTD) has recently been applied to high-dimensional signals because it can significantly reduce storage in various signal processing applications. This paper presents a low-rank singular value decomposition (SVD) approximation algorithm for large-scale matrices in tensor train format (TT-format). The proposed alternating least squares block power SVD (ALS-BPSVD) algorithm reduces computational complexity by decomposing the large-scale SVD into a low-rank approximation scheme that uses a fixed-iteration block power method to search for singular values and vectors. Moreover, a low-complexity two-step truncation scheme is proposed to further reduce complexity and facilitate parallel processing. The proposed ALS-BPSVD algorithm supports low-rank SVD approximation for matrices with dimensions larger than 2^11 × 2^11. Simulation results show that ALS-BPSVD achieves up to a 21.3× speed-up over the benchmark ALS-SVD algorithm for random matrices with prescribed singular values.