Abstract
In this chapter, properties of least squares estimates are examined for the model $$Y = X\beta + e, \qquad E(e) = 0, \qquad \operatorname{Cov}(e) = \sigma^2 I.$$ The chapter begins with a discussion of the concept of estimability in linear models. Section 2 characterizes least squares estimates. Sections 3, 4, and 5 establish that least squares estimates are best linear unbiased estimates, maximum likelihood estimates, and minimum variance unbiased estimates. The last two of these properties require the additional assumption $e \sim N(0, \sigma^2 I)$. Section 6 also assumes that the errors are normally distributed and presents the distributions of various estimates; from these distributions, various tests and confidence intervals are easily obtained. Section 7 examines the model $$Y = X\beta + e, \qquad E(e) = 0, \qquad \operatorname{Cov}(e) = \sigma^2 V,$$ where $V$ is a known positive definite matrix; it introduces weighted least squares estimates and presents properties of those estimates. Section 8 presents the normal equations and establishes their relationship to least squares and weighted least squares estimation. Section 9 discusses Bayesian estimation.
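As a minimal numerical sketch of the two estimation problems the abstract describes, the code below solves the normal equations $X'X\hat\beta = X'Y$ for ordinary least squares and $X'V^{-1}X\hat\beta = X'V^{-1}Y$ for weighted least squares with a known positive definite $V$. The design matrix, true coefficients, and diagonal $V$ are invented here purely for illustration and do not come from the chapter.

```python
import numpy as np

# Illustrative data (assumed, not from the chapter): intercept plus one covariate.
rng = np.random.default_rng(0)
n = 20
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
Y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Ordinary least squares: solve the normal equations X'X b = X'Y.
beta_ols = np.linalg.solve(X.T @ X, X.T @ Y)

# Weighted least squares with a known positive definite V (here diagonal,
# an assumed heteroscedastic structure): solve X'V^{-1}X b = X'V^{-1}Y.
V = np.diag(rng.uniform(0.5, 2.0, size=n))
Vinv = np.linalg.inv(V)
beta_wls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ Y)

print("OLS estimate:", beta_ols)
print("WLS estimate:", beta_wls)
```

Both solves are direct translations of the normal equations; in practice one would use `np.linalg.lstsq` or a Cholesky factorization rather than forming $X'X$ explicitly, but the normal-equations form matches the development in Section 8.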