Abstract

This paper develops a nonlinear model averaging (NLMA) method and constructs the corresponding weight-choosing criterion, which we call the NLC criterion, for models that are nonlinear in their parameters. NLC can also be applied as an information criterion for model selection among nonlinear regression models. NLC is based on the idea of Mallows' Cp, which can be regarded as an unbiased estimator of the mean squared error (MSE), and takes the nonlinearity of the models into consideration. Under some regularity conditions, we investigate the properties of NLMA and NLC. We show that the weight vector chosen by minimizing NLC converges in probability to the optimal weight vector that minimizes the risk. It is also proved that the NLMA estimator based on the weight vector selected by NLC asymptotically minimizes the risk of estimation. Furthermore, the distribution of the NLMA estimator is derived. Regarding finite-sample performance, simulation results show that NLMA performs much better than every single-model approximation and than other model selection and model averaging methods, achieving relatively smaller sample MSE.
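
For reference, a minimal sketch of the classical Mallows model averaging criterion in the linear setting, which is the construction that NLC adapts to parameter-nonlinear models; the notation below (candidate fits \hat{\mu}_m with k_m parameters each, weights w_m on the unit simplex, and error-variance estimate \hat{\sigma}^2) is introduced only for illustration and is not the paper's exact definition of NLC:

C_n(w) = \left\| y - \sum_{m=1}^{M} w_m \hat{\mu}_m \right\|^2 + 2\,\hat{\sigma}^2 \sum_{m=1}^{M} w_m k_m,
\qquad w_m \ge 0, \quad \sum_{m=1}^{M} w_m = 1.

Minimizing C_n(w) over the weight simplex yields the averaging weights; as described in the abstract, NLC modifies this Mallows-type construction so that the nonlinearity of the candidate models in their parameters is taken into account.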
