Abstract

The final prediction error (FPE) criterion has been widely used in model selection. For a linear regression model with k parameters, the criterion can be written as RSS(k) + λkσ², where RSS(k) is the residual sum of squares, σ² is an unbiased estimate of the error variance, and λ is a penalty for complexity. This article considers the simplest situation, in which the choice is between two Gaussian linear regression models with σ² assumed known. We define a signal-to-noise ratio b for a regression model and use b to restrict the parameter space. The loss function is taken to be the squared prediction error. Values of λ that are minimax and values of λ that are admissible are found as functions of b.
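To make the criterion concrete, the following is a minimal sketch of FPE-style selection between two nested Gaussian linear models with known error variance, as in the setting the abstract describes. The simulated data, the candidate models, and the choice λ = 2 (the classical FPE penalty) are illustrative assumptions, not taken from the paper, which instead studies which values of λ are minimax or admissible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n observations, 3 candidate predictors, but the true
# model uses only the first 2 (the third coefficient is zero).
n, sigma2 = 100, 1.0  # error variance sigma^2 treated as known
X = rng.normal(size=(n, 3))
beta_true = np.array([1.5, -0.8, 0.0])
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

def rss(X_sub, y):
    """Residual sum of squares of the least-squares fit."""
    beta_hat, *_ = np.linalg.lstsq(X_sub, y, rcond=None)
    resid = y - X_sub @ beta_hat
    return float(resid @ resid)

def fpe(X_sub, y, lam, sigma2):
    """FPE-style criterion RSS(k) + lambda * k * sigma^2 for k parameters."""
    k = X_sub.shape[1]
    return rss(X_sub, y) + lam * k * sigma2

# Choose between the two nested models: k = 2 versus k = 3.
lam = 2.0  # classical FPE penalty; an illustrative choice here
crit_small = fpe(X[:, :2], y, lam, sigma2)
crit_large = fpe(X, y, lam, sigma2)
chosen_k = 2 if crit_small <= crit_large else 3
print(f"FPE(k=2) = {crit_small:.2f}, FPE(k=3) = {crit_large:.2f}, choose k = {chosen_k}")
```

The larger model always has the smaller RSS, so the penalty λkσ² is what allows the criterion to prefer the simpler model; the paper's contribution is characterizing, under a signal-to-noise restriction b, which penalties λ are minimax or admissible for squared prediction error loss.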
