Abstract
For the tensor regression problem, a novel method, called the least square support tensor regression machine based on submatrix of a tensor (LS-STRM-SMT), is proposed. LS-STRM-SMT handles tensor regression problems more efficiently. First, we develop the least square support matrix regression machine (LS-SMRM) and propose a fixed point algorithm to solve it. Then, LS-STRM-SMT for tensor data is proposed. Inspired by the relation between color images and grayscale images, we reformulate the tensor training set and obtain the new model (LS-STRM-SMT) for the tensor regression problem. With the introduction of projection matrices and another fixed point algorithm, we turn the LS-STRM-SMT model into several related LS-SMRM models, which are solved by the algorithm for LS-SMRM. Since the fixed point algorithm is used twice while solving the LS-STRM-SMT problem, we call the overall algorithm the dual fixed point algorithm (DFPA). Our method (LS-STRM-SMT) is compared with several typical support tensor regression machines (STRMs). From a theoretical point of view, our algorithm has fewer parameters and lower computational complexity, especially when the rank K of the submatrix is small. Numerical experiments indicate that our algorithm achieves better performance.
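To make the alternating, fixed-point flavor of the approach concrete, the following is a minimal sketch, assuming a rank-K bilinear predictor of the form f(X) = Σ_k u_kᵀ X v_k + b and a ridge-style least-squares objective; the function name, the regularization parameter lam, and the update order are illustrative assumptions rather than the authors' exact LS-SMRM formulation.

```python
# Minimal sketch of an alternating (fixed point) solver for a rank-K bilinear
# least-squares regressor f(X) = sum_k u_k' X v_k + b.  This illustrates the
# flavor of LS-SMRM under assumed notation, not the authors' exact algorithm.
import numpy as np

def fit_bilinear_ls(Xs, ys, K=2, lam=1e-2, n_iter=50):
    """Xs: array of shape (n, m, p); ys: array of shape (n,)."""
    n, m, p = Xs.shape
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, K))          # left factors u_1, ..., u_K
    V = rng.standard_normal((p, K))          # right factors v_1, ..., v_K
    b = 0.0
    for _ in range(n_iter):
        # Fix V: the model is linear in vec(U), so solve a ridge least-squares problem.
        Phi = np.stack([(X @ V).ravel() for X in Xs])        # features for vec(U)
        A = Phi.T @ Phi + lam * np.eye(m * K)
        U = np.linalg.solve(A, Phi.T @ (ys - b)).reshape(m, K)
        # Fix U: the model is linear in vec(V), solve the symmetric problem.
        Psi = np.stack([(X.T @ U).ravel() for X in Xs])      # features for vec(V)
        A = Psi.T @ Psi + lam * np.eye(p * K)
        V = np.linalg.solve(A, Psi.T @ (ys - b)).reshape(p, K)
        # Update the bias with the current factors.
        preds = np.einsum('nij,ik,jk->n', Xs, U, V)
        b = float(np.mean(ys - preds))
    return U, V, b
```

Because each half-update solves its linear least-squares subproblem exactly, iterating the two updates drives the factor pair (U, V) toward a fixed point; a call such as fit_bilinear_ls(np.random.randn(100, 8, 6), np.random.randn(100)) returns the learned factors and bias.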
Highlights
In the past decades, data in the form of matrices or, more generally, multiway arrays (tensors) have found an increasing number of applications
We propose the least square support matrix regression machine, abbreviated as LS-SMRM, for regression problems with matrix input
We propose a novel method for tensor regression problems, called the least square support tensor regression machine based on submatrix of a tensor (LS-STRM-SMT), which is inspired by the idea of multiple rank multilinear SVM for matrix data classification (MRMLSVM)
Summary
In the past decades, data in the form of matrices or, more generally, multiway arrays (tensors) have found an increasing number of applications. Several tensor learning approaches for regression [10, 11] have appeared, but most of them work on vector spaces derived by stacking the original tensor elements in a more or less arbitrary order. This vectorization discards the structural information among the tensor modes and leads to very high-dimensional inputs. The remaining methods mainly take advantage of the decomposition of a matrix [12] or a tensor [6], which can reduce the high computational complexity as well as the high dimensionality at the expense of a slight decline in accuracy, but the structural information is destroyed completely.
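As a toy illustration of the dimensionality issue (the tensor sizes and the rank K below are made-up assumptions, not figures from the paper), vectorizing even a small third-order tensor already produces thousands of weights for a plain linear model, whereas a rank-K bilinear model per frontal slice needs far fewer:

```python
# Toy parameter count: a linear model on a vectorized tensor versus a rank-K
# bilinear model applied to each frontal slice (illustrative sizes only).
import numpy as np

T = np.zeros((64, 64, 3))        # e.g. a small RGB image viewed as a tensor
vectorized = T.reshape(-1)       # stacking the entries discards the mode structure
print(vectorized.size)           # 12288 weights for a plain linear model

K = 2                            # assumed rank of the bilinear factors per slice
per_slice = K * (T.shape[0] + T.shape[1])
print(per_slice * T.shape[2])    # 768 weights in total for 3 slices with K = 2
```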