Abstract

This paper provides several error estimates for total variation (TV) type regularization, which arises in a range of areas, for instance signal and image processing and machine learning. Basic properties of the minimizer of the TV regularization problem, such as stability, consistency and the convergence rate, are fully investigated. Both a priori and a posteriori parameter-choice rules are considered. Furthermore, an improved convergence rate is derived under a sparsity assumption. The non-sparse case, which is common in practice, is also discussed, and the corresponding convergence rates are presented under certain mild conditions.

Highlights

  • We provide error estimates for total variation (TV) type regularization

  • Like the classical Tikhonov regularization method [19,35,36], we introduce a source condition

  • We study some problems in total variation type regularization

Summary

Introduction

Compressed sensing [1,2] has gained increasing attention in recent years; it plays an important role in signal processing [3,4], imaging science [5,6] and machine learning [7]. Given an operator K satisfying certain conditions, it is possible to recover a sparse signal x† ∈ C^n of length n by Basis Pursuit (BP) [8], i.e., by solving min_x ‖x‖_1 subject to Kx = y, where y = Kx† are the measurements. The perfect reconstruction results established for sparse regularization cannot be applied to the TV type directly, especially when T is ill-posed (T has a nontrivial null space). A linear convergence rate can be derived under the sparsity assumption on Tx† and some suitable conditions on K; this derivation does not depend on the injectivity of K. Last, based on some recent works [37,38,39], which assume that Tx† is not sparse, a convergence rate is given for this case as well.
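The TV type regularization problem described above can be illustrated numerically. The following is a minimal sketch, not taken from the paper: it solves the anisotropic 1-D model min_x ½‖Kx − y‖² + α‖Dx‖₁ via a standard ADMM splitting, where the forward-difference matrix D plays the role of the operator T, and all names and parameter values are illustrative.

```python
import numpy as np

def tv_regularized_solve(K, y, alpha, rho=1.0, n_iter=500):
    """ADMM sketch for min_x 0.5*||Kx - y||^2 + alpha*||Dx||_1,
    with D the 1-D forward-difference operator (anisotropic TV)."""
    n = K.shape[1]
    D = np.diff(np.eye(n), axis=0)          # (n-1) x n difference matrix
    z = np.zeros(n - 1)                     # split variable z = Dx
    u = np.zeros(n - 1)                     # scaled dual variable
    A = K.T @ K + rho * D.T @ D             # fixed x-update system matrix
    for _ in range(n_iter):
        # x-update: solve (K'K + rho D'D) x = K'y + rho D'(z - u)
        x = np.linalg.solve(A, K.T @ y + rho * D.T @ (z - u))
        # z-update: soft-thresholding of Dx + u at level alpha/rho
        w = D @ x + u
        z = np.sign(w) * np.maximum(np.abs(w) - alpha / rho, 0.0)
        # dual update
        u = u + D @ x - z
    return x

# Illustrative usage: a piecewise-constant signal has sparse Tx†,
# so it can be recovered from underdetermined measurements (m < n).
rng = np.random.default_rng(0)
n, m = 60, 40
x_true = np.concatenate([np.zeros(20), np.ones(25), -0.5 * np.ones(15)])
K = rng.standard_normal((m, n)) / np.sqrt(m)
y = K @ x_true                              # noiseless measurements
x_hat = tv_regularized_solve(K, y, alpha=1e-3)
```

Since Dx_true has only two nonzero entries, the sparsity assumption on Tx† holds here, which is the regime in which the improved convergence rate applies.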

Notation
Basic Error Estimations
Stability
Consistency
Convergence Rate
Performance under Sparsity Assumption
Performance if Sparsity Assumption Fails
Conclusions