Abstract
When estimating an unknown function from a data set of n observations, the function is often known to be convex. For example, the long-run average waiting time of a customer in a single server queue is known to be convex in the service rate (Weber 1983) even though there is no closed-form formula for the mean waiting time, so it must be estimated from a data set. A computationally efficient way of finding the best fit of a convex function to the data set is to compute the least absolute deviations estimator, which minimizes the sum of absolute deviations over the set of convex functions. This estimator exhibits favorable numerical behavior: it can be computed faster and for larger data sets than other existing methods (Lim & Luo 2014). In this paper, we establish the validity of the least absolute deviations estimator by proving that it converges almost surely to the true function as n increases to infinity under modest assumptions.
Highlights
The long-run average waiting time of a customer in a single server queue is known to be convex in the service rate even though there is no closed-form formula for the mean waiting time, so it must be estimated from a data set
A computationally efficient way of finding the best fit of a convex function to the data set is to compute the least absolute deviations estimator, which minimizes the sum of absolute deviations over the set of convex functions
We study the problem of finding the best fit of an unknown convex function f∗ : [0, 1]^d → R to a data set of n observations (X1, Y1), . . . , (Xn, Yn), where each Yi is a noisy observation of f∗(Xi)
Summary
We study the problem of finding the best fit of an unknown convex function f∗ : [0, 1]^d → R to a data set of n observations (X1, Y1), . . . , (Xn, Yn), where each Yi is a noisy observation of f∗(Xi). Lim & Luo (2014) suggest computing the estimator gn : [0, 1]^d → R that minimizes the sum of absolute deviations over the set of convex functions. Numerical results presented in Lim & Luo (2014) suggest that the least absolute deviations estimator gn can be computed faster and for larger data sets than the least squares estimator. Another advantage of least absolute deviations estimators is that they can provide more robust results because they are less sensitive to outliers in the data set (Bassett & Koenker 1978, Wagner 1959).
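To make the estimator concrete, the following sketch shows how a least absolute deviations convex fit can be computed as a linear program over fitted values and subgradients at the design points. This is an illustrative formulation, not the authors' code: the synthetic data, the cvxpy modelling package, and all variable names are our own assumptions.

import numpy as np
import cvxpy as cp

# Synthetic example (assumption, not from the paper): noisy observations of a
# convex function f*(x) = ||x||^2 on [0, 1]^d.
rng = np.random.default_rng(0)
n, d = 50, 2
X = rng.uniform(size=(n, d))                      # design points X_1, ..., X_n
Y = np.sum(X**2, axis=1) + 0.1 * rng.standard_normal(n)

# Decision variables: fitted values g_i and subgradients xi_i at each X_i.
g = cp.Variable(n)
xi = cp.Variable((n, d))

# Convexity is imposed through supporting-hyperplane constraints:
# g_j >= g_i + xi_i^T (X_j - X_i) for all pairs (i, j).
constraints = [
    g[j] >= g[i] + cp.sum(cp.multiply(xi[i], X[j] - X[i]))
    for i in range(n) for j in range(n) if i != j
]

# Least absolute deviations objective: minimize sum_i |Y_i - g_i|.
problem = cp.Problem(cp.Minimize(cp.sum(cp.abs(Y - g))), constraints)
problem.solve()                                   # reduces to a linear program

print("optimal sum of absolute deviations:", problem.value)
print("fitted values at the first design points:", g.value[:5])

One way to extend the fit from the design points to all of [0, 1]^d is to take the pointwise maximum of the fitted supporting hyperplanes, max_i { g_i + xi_i^T (x - X_i) }, which yields a piecewise-linear convex function interpolating the optimal fitted values.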