Abstract
This study proposes a novel complete subset averaging (CSA) method for high-dimensional generalized linear models based on a penalized Kullback–Leibler (KL) loss. All candidate models may be misspecified, and the dimension of the covariates is allowed to diverge to infinity. The uniform convergence rate and asymptotic normality of the proposed estimator are established. Moreover, the estimator is asymptotically optimal in the sense of achieving the lowest possible KL loss. To ease the computational burden, we randomly draw a fixed number of subsets from the complete subsets and show that the resulting estimator is asymptotically equivalent to the full CSA estimator. Monte Carlo simulations and an empirical application demonstrate that the proposed CSA method outperforms popular model-averaging methods.
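As a rough illustration of the subset-averaging idea described above, the sketch below averages logistic-regression fits over randomly drawn covariate subsets of a fixed size, scoring each candidate by an in-sample log-loss plus a dimension penalty as a stand-in for the penalized KL criterion. The penalty form, the exponential weighting, and all names and tuning values are illustrative assumptions, not the paper's actual weight optimization.

```python
# Minimal sketch of complete subset averaging for a GLM (logistic
# regression). The subset sampler, penalty, and weighting scheme are
# illustrative stand-ins, not the paper's method.
import itertools

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

rng = np.random.default_rng(0)

# Toy data: n observations, p candidate covariates (hypothetical setup).
n, p = 500, 8
X = rng.standard_normal((n, p))
beta = np.array([1.0, -0.8, 0.5] + [0.0] * (p - 3))
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta)))

k = 3             # subset size under consideration
lam = 2.0 / n     # penalty weight on model dimension (assumed form)
max_draws = 20    # randomly draw a fixed number of subsets

all_subsets = list(itertools.combinations(range(p), k))
if len(all_subsets) > max_draws:
    idx = rng.choice(len(all_subsets), size=max_draws, replace=False)
    subsets = [all_subsets[i] for i in idx]
else:
    subsets = all_subsets

crit, preds = [], []
for s in subsets:
    model = LogisticRegression().fit(X[:, s], y)
    p_hat = model.predict_proba(X[:, s])[:, 1]
    # Penalized KL-type criterion: in-sample log-loss plus a
    # dimension penalty (the paper's exact penalty may differ).
    crit.append(log_loss(y, p_hat) + lam * k)
    preds.append(p_hat)

# Exponential weighting of candidate models by their criterion values;
# the paper instead chooses weights to minimize the penalized KL loss.
c = np.array(crit)
w = np.exp(-n * (c - c.min()))
w /= w.sum()

averaged_prediction = np.average(np.vstack(preds), axis=0, weights=w)
print("effective number of models:", 1.0 / np.sum(w**2))
```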