Abstract

Missing data are a common problem in real-world data analysis. In this paper, a Mallows model averaging method based on kernel regression imputation is proposed for linear regression models with responses missing at random. We prove that the method asymptotically achieves the lowest possible squared error. Compared with existing model averaging methods, the new method does not require a parametric model to characterize the missingness mechanism. Monte Carlo simulations and a real-data application demonstrate the usefulness of the proposed method.
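The two-stage idea in the abstract (impute missing responses nonparametrically, then average candidate models by minimizing a Mallows-type criterion) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Gaussian Nadaraya-Watson imputer on a single covariate, the fixed bandwidth `h`, and the nested candidate models are all simplifying assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 200, 4
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.5, 0.25, 0.125])
y = X @ beta + rng.normal(scale=0.5, size=n)

# Responses missing at random: missingness depends only on a covariate.
miss = rng.random(n) < 1.0 / (1.0 + np.exp(-X[:, 0]))
y_work = y.copy()

# Stage 1: kernel regression (Nadaraya-Watson) imputation of the
# missing responses, Gaussian kernel on the first covariate only.
h = 0.5  # bandwidth, a hypothetical choice
x1 = X[:, 0]
for i in np.where(miss)[0]:
    w = np.exp(-0.5 * ((x1[~miss] - x1[i]) / h) ** 2)
    y_work[i] = np.sum(w * y[~miss]) / np.sum(w)

# Stage 2: Mallows model averaging over nested candidate models.
M = p
fits, ks = [], []
for m in range(1, M + 1):
    Xm = X[:, :m]
    bm, *_ = np.linalg.lstsq(Xm, y_work, rcond=None)
    fits.append(Xm @ bm)
    ks.append(m)  # number of parameters in candidate model m
fits = np.column_stack(fits)

# Estimate sigma^2 from the largest candidate model.
resid = y_work - fits[:, -1]
sigma2 = resid @ resid / (n - M)

def mallows(w):
    # Mallows criterion: residual sum of squares + 2*sigma^2*effective size
    r = y_work - fits @ w
    return r @ r + 2.0 * sigma2 * np.dot(w, ks)

# Minimize over the weight simplex (weights nonnegative, summing to one).
cons = [{"type": "eq", "fun": lambda w: np.sum(w) - 1.0}]
res = minimize(mallows, np.full(M, 1.0 / M), method="SLSQP",
               bounds=[(0.0, 1.0)] * M, constraints=cons)
w_hat = res.x
y_hat = fits @ w_hat  # model-averaged fitted values
```

The simplex-constrained minimization is a small quadratic program; a dedicated QP solver could replace `scipy.optimize.minimize` in practice.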
