Abstract

In this paper, we present an overview of constrained PARAFAC models in which the constraints model linear dependencies among columns of the factor matrices of the tensor decomposition or, alternatively, the pattern of interactions between different modes of the tensor, as captured by the equivalent core tensor. Some tensor prerequisites are first introduced, with a particular emphasis on mode combination using Kronecker products of canonical vectors, which simplifies matricization operations. This Kronecker-product-based approach is also formulated in terms of the index notation, which provides an original and concise formalism for both matricizing tensors and writing tensor models. Then, after a brief reminder of the PARAFAC and Tucker models, two families of constrained tensor models, the so-called PARALIND/CONFAC and PARATUCK models, are described in a unified framework for $N^{th}$-order tensors. New tensor models, called nested Tucker models and block PARALIND/CONFAC models, are also introduced. A link between PARATUCK models and constrained PARAFAC models is then established. Finally, new uniqueness properties of PARATUCK models are deduced from sufficient conditions for essential uniqueness of their associated constrained PARAFAC models.
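As an illustrative sketch (not taken from the paper), a third-order PARAFAC model $\mathcal{X} = \sum_{r=1}^{R} a_r \circ b_r \circ c_r$ admits the well-known mode-1 matricized form $X_{(1)} = A\,(C \diamond B)^T$, where $\diamond$ denotes the Khatri-Rao (column-wise Kronecker) product. The following minimal NumPy check, using 0-based indexing and a column ordering in which the third mode varies slowest, verifies this identity numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# Build the PARAFAC tensor X = sum_r a_r (outer) b_r (outer) c_r
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Khatri-Rao product C (kr) B: column r is kron(c_r, b_r), shape (K*J, R)
CkB = np.stack([np.kron(C[:, r], B[:, r]) for r in range(R)], axis=1)

# Mode-1 matricization: X1[i, k*J + j] = X[i, j, k]
X1 = X.transpose(0, 2, 1).reshape(I, K * J)

# The matricized PARAFAC model: X1 = A (C kr B)^T
assert np.allclose(X1, A @ CkB.T)
```

Other column orderings of the unfolding simply permute the factors in the Khatri-Rao product.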

Highlights

  • Tensor calculus was introduced in differential geometry at the end of the nineteenth century, and tensor analysis was developed in the context of Einstein’s theory of general relativity, with the introduction of index notation, the so-called Einstein summation convention, at the beginning of the twentieth century, which simplifies and shortens physics equations involving tensors

  • The use of the index notation for mode combination based on Kronecker products provides an original and concise way to derive vectorized and matricized forms of tensor models

  • Particular focus is placed on constrained tensor models, with a view to designing multiple-input multiple-output (MIMO) communication systems with resource allocation
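The mode-combination property underlying the highlights above can be stated for canonical basis vectors: with 0-based indexing, $e_i^{(I)} \otimes e_j^{(J)} = e_{iJ+j}^{(IJ)}$, i.e. the Kronecker product of two canonical vectors is again a canonical vector whose index combines the two original indices. A small numerical check of this standard identity (a sketch, not code from the paper):

```python
import numpy as np

def e(i, n):
    """Canonical basis vector of length n with a 1 at (0-based) position i."""
    v = np.zeros(n)
    v[i] = 1.0
    return v

I, J = 3, 4
for i in range(I):
    for j in range(J):
        # kron(e_i^(I), e_j^(J)) = e_{i*J + j}^(I*J)
        assert np.array_equal(np.kron(e(i, I), e(j, J)), e(i * J + j, I * J))
```

This is the mechanism by which Kronecker products of canonical vectors combine several tensor modes into a single index when deriving matricized and vectorized forms.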



1.1 Introduction

Tensor calculus was introduced in differential geometry at the end of the nineteenth century, and tensor analysis was developed in the context of Einstein’s theory of general relativity, with the introduction of index notation, the so-called Einstein summation convention, at the beginning of the twentieth century, which allows one to simplify and shorten physics equations involving tensors. Nowadays, (high-order) tensors, called multi-way arrays in the data analysis community, play an important role in many fields of application for representing and analyzing multidimensional data, as in psychometrics, chemometrics, the food industry, environmental sciences, signal/image processing, computer vision, neuroscience, information sciences, data mining, and pattern recognition, among many others. They can be viewed as multidimensional arrays of numbers, generalizing vectors and matrices, which are first- and second-order tensors respectively, to orders higher than two.

Tensor prerequisites
Mode combination
Nested Tucker models
Rewriting of PARATUCK models as constrained PARAFAC models
Comparison of constrained tensor models
Conclusions

