Abstract

Convex optimization now plays an essential role in many facets of statistics. We briefly survey some recent developments and describe some implementations of these methods in R. Applications of linear and quadratic programming are introduced, including quantile regression, the Huber M-estimator, and various penalized regression methods. Applications to additively separable convex problems subject to linear equality and inequality constraints, such as nonparametric density estimation and maximum likelihood estimation of general nonparametric mixture models, are described, as are several cone programming problems. We focus throughout primarily on implementations in the R environment that rely on solution methods linked to R, such as MOSEK via the package Rmosek. Code is provided in R to illustrate several of these problems. Other applications are available in the R package REBayes, which deals with empirical Bayes estimation of nonparametric mixture models.
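
As a concrete illustration of the linear programming applications mentioned above, the sketch below fits a median regression (quantile regression at tau = 0.5) by recasting the least-absolute-error problem as a linear program. This is a minimal example written for this summary, not the paper's own code: the generic lpSolve solver stands in for Rmosek, which the paper relies on, and quantreg::rq() is used only as a reference fit.

    ## A minimal sketch (not the paper's own code): median regression,
    ## i.e. quantile regression at tau = 0.5, recast as a linear program.
    ## lpSolve stands in here for a generic LP solver; the paper itself
    ## works with MOSEK via Rmosek.  quantreg::rq() provides a reference fit.
    library(lpSolve)
    library(quantreg)

    set.seed(1)
    n <- 50
    x <- rnorm(n)
    y <- 1 + 2 * x + rt(n, df = 3)      # heavy-tailed errors
    X <- cbind(1, x)                    # design matrix with intercept
    p <- ncol(X)

    ## Write min_b sum_i |y_i - x_i'b| as an LP in nonnegative variables
    ## (b+, b-, u, v):  min 1'u + 1'v  subject to  X b+ - X b- + u - v = y.
    obj  <- c(rep(0, 2 * p), rep(1, 2 * n))
    Amat <- cbind(X, -X, diag(n), -diag(n))
    fit  <- lp("min", obj, Amat, rep("=", n), y)
    beta <- fit$solution[1:p] - fit$solution[(p + 1):(2 * p)]

    ## The two fits should agree up to solver tolerances.
    rbind(LP = beta, rq = coef(rq(y ~ x, tau = 0.5)))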

Highlights

  • Optimality, in statistics as in the rest of life, is probably over-rated; better to be “not bad” most of the time, than perfect once in a while

  • While convexity plays an essential role in many aspects of statistical theory – it is crucial in the theory of estimation and inference for exponential family models, in experimental design, in the underpinnings of the Neyman-Pearson lemma, and in much of modern decision theory – our main objective will be to describe some recent developments in computational statistics that rely on recent algorithmic progress in convex optimization, and to illustrate their implementation in R (R Core Team 2014)

  • Once it was seen how linear inequality constraints could be incorporated into the Newton framework via log-barrier penalties for linear programming, there was an inevitable torrent of work designed to adapt similar methods to other convex optimization settings; a small numerical sketch of the log-barrier idea follows this list
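
The sketch below illustrates the log-barrier idea mentioned in the last highlight on a toy linear program. It is illustrative code written for this summary, not the paper's own implementation and not the algorithm used by modern interior-point solvers; the function barrier_lp and its arguments are hypothetical names, and a strictly feasible starting point is assumed.

    ## Minimize cost'x subject to A x <= b by applying damped Newton steps to
    ##   f_t(x) = t * cost'x - sum(log(b - A x))
    ## for an increasing sequence of barrier parameters t (toy code).
    barrier_lp <- function(cost, A, b, x0, t0 = 1, mu = 10, tol = 1e-8) {
      fobj <- function(x, tval) tval * sum(cost * x) - sum(log(b - A %*% x))
      x <- x0                                 # x0 must be strictly feasible
      m <- nrow(A)
      tval <- t0
      while (m / tval > tol) {                # m / t bounds the duality gap
        repeat {                              # inner Newton iterations
          s <- as.vector(b - A %*% x)         # slacks, must stay positive
          g <- tval * cost + t(A) %*% (1 / s) # gradient of f_t
          H <- t(A) %*% (A / s^2)             # Hessian: A' diag(1/s^2) A
          dx <- as.vector(-solve(H, g))
          if (sum(-g * dx) / 2 < 1e-10) break # Newton decrement stopping rule
          step <- 1                           # backtrack: stay feasible and
          while (any(b - A %*% (x + step * dx) <= 0) ||     # decrease f_t
                 fobj(x + step * dx, tval) >
                   fobj(x, tval) + 0.25 * step * sum(g * dx)) {
            step <- step / 2
          }
          x <- x + step * dx
        }
        tval <- mu * tval                     # tighten the barrier
      }
      drop(x)
    }

    ## Toy problem: minimize -x1 - x2 over the unit box [0, 1]^2;
    ## the solution is (1, 1).
    A <- rbind(diag(2), -diag(2))
    b <- c(1, 1, 0, 0)
    barrier_lp(cost = c(-1, -1), A = A, b = b, x0 = c(0.5, 0.5))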

Summary

Introduction

Optimality, in statistics as in the rest of life, is probably over-rated; better to be “not bad” most of the time, than perfect once in a while. Without convexity we risk wandering around in the wilderness, always looking for a higher mountain or a deeper valley. With convexity we can proceed with confidence toward a solution. While convexity plays an essential role in many aspects of statistical theory – it is crucial in the theory of estimation and inference for exponential family models, in experimental design, in the underpinnings of the Neyman-Pearson lemma, and in much of modern decision theory – our main objective will be to describe some recent developments in computational statistics that rely on recent algorithmic progress in convex optimization, and to illustrate their implementation in R (R Core Team 2014). In the remaining sections we will sketch some basic unifying theory for a variety of convex optimization problems arising in statistics and discuss some aspects of their implementation in R. In the final section we describe some future developments that we would view as desirable.

Convex optimization
Linear programming
Quadratic programming
Second-order cone programming
Generalized inequalities and cone programming
Separable convex programs
Convex optimization in R
Separable convex optimization
What’s next?
The Dantzig selector
