Abstract
Tensor data are multi-dimensional arrays. Low-rank decomposition-based regression methods with tensor predictors exploit the structural information in the predictors while substantially reducing the number of parameters in tensor regression (TR). We propose a method, Noise Augmentation for regularization on the Core Tensor in Tucker decomposition, to regularize the parameters in TR coupled with Tucker decomposition. We establish theoretically that the method achieves exact regularization on the core tensor from the Tucker decomposition in both linear TR and generalized linear TR. To our knowledge, it is the first Tucker decomposition-based regularization method in TR to achieve regularization on core tensors. The method is implemented through an iterative procedure with two straightforward steps per iteration: generating noisy data based on the core tensor from the Tucker decomposition of the current parameter estimate, and running a regular GLM on the noise-augmented data with vectorized predictors. We demonstrate the implementation of the method and its regularization effect in both simulation studies and real-data applications. The results suggest that it can improve prediction over other decomposition-based TR approaches, with or without regularization, and that it identifies important predictors even though it is not designed for that purpose.
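The two-step iteration described above can be sketched in NumPy. This is an illustrative, hypothetical implementation, not the paper's algorithm: the Tucker decomposition is taken as a truncated HOSVD, the "noisy" augmented observations are replaced by their deterministic surrogate (pseudo-rows that penalize small core entries, with weights `lam / (g_j^2 + eps)` chosen here purely for illustration), and the GLM step is plain least squares on the vectorized, augmented data. All function names, weights, and defaults are assumptions.

```python
import numpy as np
from functools import reduce


def hosvd(B, ranks):
    """Truncated higher-order SVD: Tucker core and per-mode orthonormal factors."""
    factors = []
    for mode, r in enumerate(ranks):
        unfold = np.moveaxis(B, mode, 0).reshape(B.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfold, full_matrices=False)
        factors.append(U[:, :r])
    core = B
    for mode, U in enumerate(factors):
        # Multiply mode-`mode` of the core by U^T.
        core = np.moveaxis(
            np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode
        )
    return core, factors


def na_core_regression(X, y, ranks, lam=1e-2, n_iter=10, eps=1e-8):
    """Sketch of noise-augmented TR: each iteration (1) Tucker-decomposes the
    current coefficient estimate, (2) appends pseudo-observations (response 0)
    that shrink small core entries, (3) refits OLS on the augmented data."""
    n = X.shape[0]
    shape = X.shape[1:]
    Xv = X.reshape(n, -1)
    beta = np.linalg.lstsq(Xv, y, rcond=None)[0]  # unregularized start
    for _ in range(n_iter):
        core, factors = hosvd(beta.reshape(shape), ranks)
        # Row-major vec: vec(B) = T @ vec(core), T = kron of the factors.
        T = reduce(np.kron, factors)
        w = lam / (core.reshape(-1) ** 2 + eps)   # heavy weight on small entries
        A = np.sqrt(w)[:, None] * T.T             # augmented design rows
        X_aug = np.vstack([Xv, A])
        y_aug = np.concatenate([y, np.zeros(A.shape[0])])
        beta = np.linalg.lstsq(X_aug, y_aug, rcond=None)[0]
    return beta.reshape(shape)
```

On a simulated rank-1 matrix coefficient, the augmented fit recovers the coefficient while driving the core entries outside the true rank toward zero; the deterministic pseudo-rows play the role of the expectation of the Gaussian noise terms in the actual method.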