Abstract
When a tensor is partitioned into subtensors, certain tensor norms of these subtensors form a tensor called a norm compression tensor. Norm compression inequalities for tensors concern the relation between the norm of this compressed tensor and the norm of the original tensor. We prove that for the tensor spectral norm, the norm of the compressed tensor is an upper bound on the norm of the original tensor. This result extends to a general class of tensor spectral norms. We discuss various applications of norm compression inequalities for tensors. These inequalities improve many existing bounds on tensor norms in the literature, in particular tightening the general bound on the tensor spectral norm via tensor partitions. We study the extremal ratio between the spectral norm and the Frobenius norm of a tensor space, provide a general way to estimate its upper bound, and in particular improve the current best upper bound for third-order nonnegative tensors and symmetric tensors. We also propose a faster approach to estimate the spectral norm of a large tensor or matrix via sequential norm compression inequalities, with theoretical and numerical evidence. For instance, the complexity of our algorithm for the matrix spectral norm is $O\left(n^{2+\epsilon}\right)$, where $\epsilon$ ranges from 0 to 1 depending on the partition, and the estimate ranges correspondingly from a close upper bound to the exact spectral norm.
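In the matrix case, the norm compression inequality described above says that if a matrix is partitioned into blocks and each block is replaced by its spectral norm, the spectral norm of the resulting small matrix upper-bounds that of the original. The following is a minimal numerical sketch of this statement (an illustration written for this summary, not code from the paper); the partition into a 2 x 2 grid of equal blocks is an arbitrary choice for the example.

```python
import numpy as np

# Hedged illustration: build a random 8 x 8 matrix A, partition it into
# a 2 x 2 grid of 4 x 4 blocks, and form the norm compression matrix B
# whose (i, j) entry is the spectral norm of block A_ij.
rng = np.random.default_rng(0)
n = 8
A = rng.standard_normal((n, n))

k = n // 2  # block size for the 2 x 2 partition
B = np.empty((2, 2))
for i in range(2):
    for j in range(2):
        block = A[i * k:(i + 1) * k, j * k:(j + 1) * k]
        B[i, j] = np.linalg.norm(block, 2)  # spectral norm of each block

norm_A = np.linalg.norm(A, 2)  # spectral norm of the original matrix
norm_B = np.linalg.norm(B, 2)  # spectral norm of the compressed matrix

# The compression inequality predicts norm_A <= norm_B.
print(norm_A, norm_B)
```

Computing `B` costs only the spectral norms of small blocks, which is the source of the speedup in the sequential estimation scheme: coarser partitions give a looser but cheaper upper bound, while finer analysis drives the estimate toward the exact norm.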