Abstract

We extend the notion of trend filtering to tensors by considering the $k\mathrm{th}$-order Vitali variation, a discretized version of the integral of the absolute value of the $k\mathrm{th}$-order total derivative. We prove adaptive $\ell^0$-rates and not-so-slow $\ell^1$-rates for tensor denoising with trend filtering. For $k \in \{1,2,3,4\}$ we prove that the $d$-dimensional margin of a $d$-dimensional tensor can be estimated at the $\ell^0$-rate $n^{-1}$, up to logarithmic terms, if the underlying tensor is a product of $(k-1)\mathrm{th}$-order polynomials on a constant number of hyperrectangles. For general $k$ we prove the $\ell^1$-rate of estimation $n^{- \frac{H(d)+2k-1}{2H(d)+2k-1}}$, up to logarithmic terms, where $H(d)$ is the $d\mathrm{th}$ harmonic number. Thanks to an ANOVA-type decomposition, we can apply these results to the lower-dimensional margins of the tensor to prove bounds for denoising the whole tensor. Our tools are interpolating tensors to bound the effective sparsity for $\ell^0$-rates, mesh grids for $\ell^1$-rates, and, in the background, the projection arguments of Dalalyan, Hebiri, and Lederer (2017).
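To fix ideas, the discretized Vitali variation described above can be sketched in a few lines: sum the absolute values of the $k\mathrm{th}$-order mixed differences of the tensor, taken along every dimension in turn. This is an illustrative sketch, not the paper's exact definition (boundary conventions and scaling factors are omitted here).

```python
import numpy as np

def vitali_tv(tensor, k=1):
    """Illustrative discretized kth-order Vitali variation:
    sum of absolute kth-order mixed differences along all dimensions."""
    diffs = tensor
    for axis in range(tensor.ndim):
        diffs = np.diff(diffs, n=k, axis=axis)
    return np.abs(diffs).sum()

# A constant tensor (a product of 0th-order polynomials, i.e. k = 1)
# has zero first-order Vitali variation.
flat = np.ones((5, 5))
print(vitali_tv(flat, k=1))  # → 0.0

# A single jump in one corner contributes to the mixed differences.
bump = np.array([[0.0, 0.0], [0.0, 1.0]])
print(vitali_tv(bump, k=1))  # → 1.0
```

Note that, because the mixed differences annihilate functions that are additive across dimensions, the Vitali variation is sensitive only to genuine interactions, which is what makes it a natural companion to the ANOVA-type decomposition mentioned above.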

Highlights

  • Thanks to an ANOVA-type decomposition, we can apply our results to the lower-dimensional margins of the tensor to prove bounds for denoising the whole tensor

  • We propose an ANOVA decomposition ensuring that all the margins of a $d$-dimensional tensor can be estimated adaptively

  • We prove not-so-slow $\ell^1$-rates for tensor denoising with trend filtering, see Theorem 3.2


Introduction

We show that the underlying tensor f⁰ can be estimated adaptively with a regularized least-squares signal approximator. As regularizer we propose the Vitali variation of the (k − 1)th-order total differences of the candidate estimator, for k ≥ 1. We call this regularizer the “kth-order Vitali total variation”. We use the abbreviation TV for “total variation”. This approach extends the idea of “trend filtering” [9, 22] to tensors. We introduce the notion of TV regularization, review the literature on adaptive results for TV regularization, explain the concept of adaptation for structured problems, introduce an ANOVA-type decomposition of a tensor, outline our contributions, and present the organization of the paper.

TV regularization
Literature review: adaptive results for TV regularization
Adaptation for structured problems
ANOVA decomposition
Contributions
Organization of the paper
Signals supported on d-dimensional tensors
Linear subspaces and orthogonal projections
Estimator
Active sets
Preview of the results
Synthesis form
Dictionary for general d
Adaptivity
Main result
Some definitions
Effective sparsity via interpolating tensors
Requirements on an interpolating tensor
Bound on the effective sparsity for trend filtering
Matching derivatives
A bound on the effective sparsity
Mesh grids
The inverse scaling factor when S is an enlarged mesh grid
Denoising lower-dimensional margins
Margins as lower dimensional objects
The estimator for the lower-dimensional margins
Adaptivity of trend filtering
Not-so-slow rates for trend filtering
Conclusion
Partial integration
