Abstract

Positive definiteness and sparsity are the most important properties of high-dimensional precision matrices. To better achieve these properties, this paper estimates high-dimensional precision matrices using a sparse lasso-penalized D-trace loss under a positive-definiteness constraint. We derive an efficient accelerated gradient method to solve this challenging optimization problem and establish its convergence rate as O(1/k²). Numerical simulations illustrate that our method has a competitive advantage over other methods.
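For concreteness, the constrained estimator described above presumably takes the following form, a sketch based on the lasso-penalized D-trace loss of Zhang et al. (the exact placement of the penalty, e.g. whether the diagonal is penalized, is an assumption not stated here):

```latex
\hat{\Theta}
  = \arg\min_{\Theta \succeq \varepsilon I}
    \;\frac{1}{2}\,\langle \Theta^{2},\, \hat{\Sigma}_n \rangle
    - \operatorname{tr}(\Theta)
    + \lambda \,\lVert \Theta \rVert_{1},
```

where \(\hat{\Sigma}_n\) is the sample covariance matrix, \(\lambda > 0\) is a tuning parameter, and \(\varepsilon > 0\) is an arbitrarily small constant enforcing positive definiteness.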

Highlights

  • In the past twenty years, high-dimensional data has been one of the most active directions in statistics

  • Estimating high-dimensional precision matrices is increasingly becoming a crucial question in many fields

  • To obtain a better estimator of the high-dimensional precision matrix and achieve a more optimal convergence rate, this paper mainly proposes an effective algorithm, an accelerated gradient method ([10]) with fast global convergence rates, to solve problem (1)


Summary

Introduction

In the past twenty years, high-dimensional data has been one of the most active directions in statistics. Zhang et al. [9] considered a constrained convex optimization framework for high-dimensional precision matrices: they used a lasso-penalized D-trace loss in place of the traditional lasso objective and enforced the positive-definiteness constraint {Θ ⪰ εI} for some arbitrarily small ε > 0. To obtain a better estimator of the high-dimensional precision matrix and achieve a more optimal convergence rate, this paper mainly proposes an effective algorithm, an accelerated gradient method ([10]) with fast global convergence rates, to solve problem (1). The method is mainly based on Nesterov's technique for accelerating the gradient method ([11], [12]): by exploiting the special structure of the trace norm, the classical gradient method for smooth problems can be adapted to solve trace-regularized nonsmooth problems. The proof of this theorem follows easily by applying the soft-thresholding method.
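The pieces mentioned above, the D-trace loss gradient, soft-thresholding, the positive-definiteness constraint, and Nesterov extrapolation, can be combined into a minimal NumPy sketch. This is not the paper's exact algorithm: the positive-definiteness constraint is enforced here by an eigenvalue projection applied after soft-thresholding (a heuristic composition rather than an exact constrained proximal map), and the step size 1/‖Σ̂‖₂ is an assumption taken from the Lipschitz constant of the D-trace gradient.

```python
import numpy as np

def soft_threshold(A, t):
    """Entrywise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def project_pd(A, eps):
    """Project a symmetric matrix onto {Theta : Theta >= eps * I}
    by truncating its eigenvalues at eps."""
    w, V = np.linalg.eigh((A + A.T) / 2.0)
    return (V * np.maximum(w, eps)) @ V.T

def dtrace_grad(Theta, S):
    """Gradient of the D-trace loss L(Theta) = 1/2 <Theta^2, S> - tr(Theta)."""
    return 0.5 * (S @ Theta + Theta @ S) - np.eye(S.shape[0])

def accelerated_dtrace(S, lam, eps=1e-8, step=None, n_iter=200):
    """Nesterov-style accelerated proximal gradient for the lasso-penalized
    D-trace loss (a sketch, not the paper's exact scheme).  The default step
    1/||S||_2 comes from the gradient's Lipschitz constant."""
    p = S.shape[0]
    if step is None:
        step = 1.0 / np.linalg.norm(S, 2)
    Theta = np.eye(p)            # current iterate
    Theta_prev = Theta.copy()
    t, t_prev = 1.0, 1.0
    for _ in range(n_iter):
        # Extrapolation (momentum) step
        Y = Theta + ((t_prev - 1.0) / t) * (Theta - Theta_prev)
        # Gradient step, then soft-threshold, then enforce Theta >= eps*I
        G = Y - step * dtrace_grad(Y, S)
        Theta_prev = Theta
        Theta = project_pd(soft_threshold(G, step * lam), eps)
        t_prev, t = t, (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
    return Theta
```

The eigenvalue projection keeps every iterate in the feasible set, and the momentum sequence t_k reproduces the O(1/k²) acceleration of Nesterov's scheme for the smooth part of the objective.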

Step Size Estimation
An Accelerate Gradient Method Algorithm
Convergence Analysis
Simulation
Conclusion
