Abstract

This paper is concerned with the problem of constructing a good predictive distribution relative to the Kullback–Leibler information in a linear regression model. The problem is equivalent to the simultaneous estimation of regression coefficients and error variance under a complicated risk, which raises a new and challenging issue in a decision-theoretic framework. An estimator of the variance is incorporated here into a loss for estimating the regression coefficients. Several estimators of the variance and of the regression coefficients are proposed and shown, both analytically and numerically, to improve on the usual benchmark estimators. Finally, the problem of predicting a distribution is noted to be related to information criteria for model selection such as the Akaike information criterion (AIC). Thus, several AIC variants based on the proposed improved estimators are obtained and compared numerically with AIC as model selection procedures.
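As background for the AIC comparison mentioned above, the following sketch illustrates the standard (Gaussian) AIC for a linear regression model, using the common form AIC = n log(RSS/n) + 2k; this is a minimal illustration of ordinary AIC-based model selection, not the improved variants proposed in the paper, and the simulated data and function name are assumptions for the example.

```python
import numpy as np

def gaussian_aic(y, X):
    """AIC for a Gaussian linear model y = X b + e fitted by least squares.

    Uses AIC = n*log(RSS/n) + 2k, where k counts the p regression
    coefficients plus one for the error variance (additive constants
    common to all models are dropped).
    """
    n, p = X.shape
    b, *_ = np.linalg.lstsq(X, y, rcond=None)   # least-squares coefficients
    rss = float(np.sum((y - X @ b) ** 2))       # residual sum of squares
    return n * np.log(rss / n) + 2 * (p + 1)

# Compare a correctly specified model with an over-parameterized one
# on data generated from a linear truth (illustrative simulation).
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1.0, 1.0, n)
y = 1.0 + 2.0 * x + rng.normal(0.0, 0.5, n)

X_lin = np.column_stack([np.ones(n), x])              # true linear model
X_cub = np.column_stack([np.ones(n), x, x**2, x**3])  # extra, useless terms

# The model with the smaller AIC is preferred; the cubic model pays a
# penalty of 2 per extra parameter for a typically negligible RSS gain.
print(gaussian_aic(y, X_lin), gaussian_aic(y, X_cub))
```

The paper's improved AIC variants replace the usual estimators of the coefficients and error variance inside such a criterion with decision-theoretically improved ones.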
