Abstract

Today, compact and reduced data representations based on low-rank data approximation are widely used to represent high-dimensional data sets in many application areas, such as genomics, multimedia, quantum chemistry, social networks, and visualization. To produce such low-rank representations, the input data are typically approximated by so-called alternating least squares (ALS) algorithms. However, not all of these ALS algorithms are guaranteed to converge. To address this issue, we suggest a new algorithm for the computation of a best rank-one approximation of tensors, called alternating singular value decomposition. This method is based on the computation of maximal singular values and the corresponding singular vectors of matrices. We also introduce a modification of this method and of the ALS method that ensures the alternating iterations will always converge to a semi-maximal point (a critical point in several vector variables is semi-maximal if it is maximal with respect to each vector variable while the other vector variables are kept fixed). We present several numerical examples that illustrate the computational performance of the new method in comparison with the ALS method. Copyright © 2013 John Wiley & Sons, Ltd.
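To make the ALS iteration referenced above concrete, the following is a minimal NumPy sketch of the standard ALS scheme for a best rank-one approximation of a 3-way tensor T ≈ λ·x⊗y⊗z: each unit vector is updated in turn by contracting T against the other two and renormalizing. The function name and parameters are illustrative assumptions, not the paper's implementation, and this plain ALS variant carries no convergence guarantee (the issue the paper's ASVD method addresses).

```python
import numpy as np

def als_rank_one(T, iters=100, tol=1e-10, seed=0):
    """Illustrative ALS sketch: approximate T ~ lam * outer(x, y, z)
    with unit vectors x, y, z (3-way tensor, no convergence guarantee)."""
    rng = np.random.default_rng(seed)
    n1, n2, n3 = T.shape
    # Random unit-norm starting vectors for the second and third modes.
    y = rng.standard_normal(n2); y /= np.linalg.norm(y)
    z = rng.standard_normal(n3); z /= np.linalg.norm(z)
    lam = 0.0
    for _ in range(iters):
        # Update each mode by contracting T against the other two vectors,
        # then renormalizing; lam is the norm of the last contraction.
        x = np.einsum('ijk,j,k->i', T, y, z); x /= np.linalg.norm(x)
        y = np.einsum('ijk,i,k->j', T, x, z); y /= np.linalg.norm(y)
        z = np.einsum('ijk,i,j->k', T, x, y)
        lam_new = np.linalg.norm(z); z /= lam_new
        if abs(lam_new - lam) < tol:
            lam = lam_new
            break
        lam = lam_new
    return lam, x, y, z
```

On an exactly rank-one input tensor, one sweep of these updates already recovers the factors (up to sign); the iteration matters when T has higher rank and the best rank-one term must be found by alternation.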
