Abstract

In many problems of supervised tensor learning (STL), real-world data such as face images or MRI scans are naturally represented as matrices, which are also known as second-order tensors. Most existing classifiers based on tensor representation, such as the support tensor machine (STM), must be solved iteratively, which is time-consuming and may suffer from local minima. In this paper, we present a kernel support matrix machine (KSMM) to perform supervised learning when data are represented as matrices. KSMM is a general framework for constructing a matrix-based hyperplane that exploits structural information. We analyze a unifying optimization problem for which we propose an asymptotically convergent algorithm. A theoretical analysis of the generalization bounds is derived based on Rademacher complexity with respect to a probability distribution. We demonstrate the merits of the proposed method through extensive experiments on both a simulation study and a number of real-world datasets from a variety of application domains.
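As a rough illustration of what a matrix-based hyperplane classifier looks like, the sketch below scores a matrix sample X with a matrix-shaped weight W via the Frobenius inner product, f(X) = tr(W^T X) + b, and pairs it with a generic Gaussian kernel over matrices. This is a minimal sketch under assumed forms only: the names matrix_decision and frobenius_rbf, the parameters W, b, and gamma, and the kernel choice are illustrative placeholders, not the KSMM formulation given in the paper.

```python
# Illustrative sketch only: a toy matrix-based linear decision function and a
# Frobenius-distance Gaussian kernel over matrices. The actual KSMM kernel and
# training procedure are defined in the paper; everything here is a placeholder.
import numpy as np

def matrix_decision(X, W, b):
    """Score a matrix sample X with a matrix weight W: f(X) = tr(W^T X) + b."""
    return np.sum(W * X) + b  # Frobenius inner product plus bias

def frobenius_rbf(X1, X2, gamma=0.1):
    """A generic Gaussian kernel on matrices using the Frobenius distance."""
    return np.exp(-gamma * np.linalg.norm(X1 - X2, ord="fro") ** 2)

# Toy usage: a random 4x5 "image" classified by the sign of its matrix score.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5))
W = rng.standard_normal((4, 5))
print(np.sign(matrix_decision(X, W, 0.0)))
print(frobenius_rbf(X, W))
```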
