To address model uncertainty under flexible loss functions in prediction problems, we propose a model averaging method that accommodates a broad class of loss functions, including asymmetric linear and asymmetric quadratic losses, as well as many other asymmetric and symmetric loss functions as special cases. This flexibility allows the proposed method to average a wide range of models, such as quantile and expectile regression models. To determine the weights of the candidate models, we establish a J-fold cross-validation criterion, and we prove the asymptotic optimality and weight convergence of the proposed method. Simulations and an empirical application demonstrate its superior performance compared with other model selection and model averaging methods.

History: Accepted by Ram Ramesh, Area Editor for Data Science and Machine Learning.

Funding: This work was supported by the Beijing Natural Science Foundation [Grant Z240004], the Japan Society for the Promotion of Science (KAKENHI) [Grant 22H00833 to Q. Liu], the CAS Project for Young Scientists in Basic Research [Grant YSBR-008], and the National Natural Science Foundation of China [Grants 71925007, 72091212, and 72495124].

Supplemental Material: The software that supports the findings of this study is available within the paper and its Supplemental Information ( https://pubsonline.informs.org/doi/suppl/10.1287/ijoc.2023.0291 ) as well as from the IJOC GitHub software repository ( https://github.com/INFORMSJoC/2023.0291 ). The complete IJOC Software and Data Repository is available at https://informsjoc.github.io/ .
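To make the weighting idea concrete, the following is a minimal sketch (not the paper's algorithm) of J-fold cross-validation model averaging under one flexible loss the abstract mentions, the asymmetric linear ("check") loss. All names, the toy data, the candidate-model list, and the least-squares-plus-quantile-shift fitting step are illustrative assumptions standing in for proper quantile regression estimators; the weights are chosen on the probability simplex to minimize the cross-validated loss of the averaged prediction.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Asymmetric linear ("check") loss at level tau, one of the flexible
# losses covered by the framework.
def check_loss(u, tau=0.7):
    return np.mean(u * (tau - (u < 0)))

# Toy data (illustrative): y depends only on the first regressor.
n = 200
X = rng.normal(size=(n, 3))
y = 1.0 + 2.0 * X[:, 0] + rng.normal(size=n)

# Candidate models differ in which regressor columns they use.
candidates = [[0], [0, 1], [0, 1, 2]]

def fit_predict(train_idx, test_idx, cols, tau=0.7):
    # Crude stand-in for a quantile regression fit: least squares
    # plus a quantile shift of the residuals. Illustration only.
    Xtr = np.column_stack([np.ones(len(train_idx)), X[np.ix_(train_idx, cols)]])
    beta, *_ = np.linalg.lstsq(Xtr, y[train_idx], rcond=None)
    shift = np.quantile(y[train_idx] - Xtr @ beta, tau)
    Xte = np.column_stack([np.ones(len(test_idx)), X[np.ix_(test_idx, cols)]])
    return Xte @ beta + shift

# J-fold cross-validation: out-of-fold predictions for each candidate.
J = 5
folds = np.array_split(rng.permutation(n), J)
P = np.zeros((n, len(candidates)))
for j in range(J):
    test_idx = folds[j]
    train_idx = np.concatenate([folds[k] for k in range(J) if k != j])
    for m, cols in enumerate(candidates):
        P[test_idx, m] = fit_predict(train_idx, test_idx, cols)

# Choose simplex weights minimizing the CV loss of the averaged prediction.
def cv_criterion(w):
    return check_loss(y - P @ w)

M = len(candidates)
res = minimize(cv_criterion, np.full(M, 1.0 / M),
               bounds=[(0.0, 1.0)] * M,
               constraints=({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},))
w_hat = res.x  # estimated model averaging weights
```

The averaged predictor is then `P @ w_hat` (in-sample here; in practice each candidate would be refit on the full data before combining). Selecting a single model corresponds to restricting `w_hat` to a vertex of the simplex, which is why averaging can only do at least as well on the cross-validation criterion.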