Abstract
Random vectors and their expectation and dispersion matrices are reviewed. The concept of estimability in a linear model is introduced, and the form of the best linear unbiased estimate of an estimable function is derived. The full-rank case of this result, the Gauss–Markov Theorem, is stated. The Hadamard inequality for the determinant of a positive semidefinite matrix is proved, and its application to weighing designs is discussed. An unbiased estimate of the error variance in terms of the residual sum of squares is obtained. The special case of one-way classification is described in greater detail. Finally, a general linear model with a possibly singular variance-covariance matrix is considered, and the best linear unbiased estimate of an estimable function as well as an estimate of the error variance are obtained.
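As a minimal numerical illustration of two results summarized above (the sketch below, using NumPy, is not part of the chapter; the data and model are hypothetical): Hadamard's inequality states that for a positive semidefinite matrix $A$, $\det(A) \le \prod_i a_{ii}$, and in a full-rank linear model the residual sum of squares divided by $n - p$ gives an unbiased estimate of the error variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hadamard's inequality: for positive semidefinite A, det(A) <= prod of diagonal.
B = rng.standard_normal((4, 4))
A = B @ B.T                      # A = B B^T is positive semidefinite
assert np.linalg.det(A) <= np.prod(np.diag(A)) + 1e-9

# Full-rank linear model y = X beta + e: the least-squares estimate is the
# best linear unbiased estimate (Gauss-Markov), and RSS / (n - p) is an
# unbiased estimate of the error variance.
n, p = 50, 3
X = rng.standard_normal((n, p))          # design matrix (hypothetical data)
beta = np.array([1.0, -2.0, 0.5])        # true coefficients (hypothetical)
y = X @ beta + rng.standard_normal(n)    # unit error variance
beta_hat, rss, rank, _ = np.linalg.lstsq(X, y, rcond=None)
sigma2_hat = rss[0] / (n - p)            # unbiased estimate of error variance
```

Averaged over many simulated data sets, `sigma2_hat` would concentrate near the true error variance of 1, consistent with its unbiasedness.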