Abstract

We study the finite sample behavior of Lasso-based inference methods such as post double Lasso and debiased Lasso. Both empirically and theoretically, we show that these methods can exhibit substantial omitted variable biases (OVBs) due to Lasso not selecting relevant controls. This phenomenon can be systematic and occur even when the coefficients are very sparse and the sample size is large and exceeds the number of controls. Interestingly, we also show that the OVBs can remain bounded even if the Lasso prediction errors are unbounded. We compare the Lasso-based inference methods to modern high-dimensional OLS-based methods and provide practical guidance.
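To fix ideas, the following is a minimal sketch of the post double Lasso procedure discussed in the abstract (Lasso selection in the outcome and treatment equations, followed by OLS on the treatment and the union of selected controls). The simulated design, all parameter values, and the use of scikit-learn's LassoCV are illustrative assumptions, not the paper's implementation or penalty choice; whether an OVB materializes in practice depends on the penalty level and the magnitudes of the control coefficients.

```python
# Illustrative sketch (assumed design): post double Lasso for the effect of a
# treatment d on an outcome y with high-dimensional controls X.
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression

rng = np.random.default_rng(0)
n, p = 500, 200            # sample size larger than the number of controls
alpha_true = 1.0           # true treatment effect (assumed)

# Simulated data: one relevant control (column 0) with a moderate coefficient
# that also drives the treatment, so dropping it induces an OVB.
X = rng.normal(size=(n, p))
d = 0.5 * X[:, 0] + rng.normal(size=n)
y = alpha_true * d + 0.3 * X[:, 0] + rng.normal(size=n)

def post_double_lasso(X, d, y):
    # Step 1: Lasso of y on X; Step 2: Lasso of d on X.
    sel_y = np.flatnonzero(LassoCV(cv=5).fit(X, y).coef_)
    sel_d = np.flatnonzero(LassoCV(cv=5).fit(X, d).coef_)
    union = np.union1d(sel_y, sel_d)
    # Step 3: OLS of y on d and the union of selected controls.
    Z = np.column_stack([d, X[:, union]]) if union.size else d.reshape(-1, 1)
    return LinearRegression().fit(Z, y).coef_[0]

print("post double Lasso estimate:", post_double_lasso(X, d, y))
print("OLS with all controls:    ",
      LinearRegression().fit(np.column_stack([d, X]), y).coef_[0])
```

Comparing the post double Lasso estimate with OLS on all controls (feasible here because n > p) mirrors the comparison between Lasso-based and OLS-based methods described in the abstract.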
