Abstract
We provide a new, concise derivation of necessary and sufficient conditions for the explicit characterization of the general nonnegative-definite covariance structure V of a general Gauss-Markov model with E(y) = Xβ and Var(y) = V such that the best linear unbiased estimator (BLUE), the weighted least squares estimator (WLSE), and the least squares estimator (LSE) of Xβ are identical. In addition, we derive a representation of this general nonnegative-definite covariance structure V in terms of its Moore-Penrose pseudo-inverse.
Highlights
We consider the general Gauss-Markov model y = Xβ + ε, (1) where y is an n × 1 vector of observations, X is an n × p known, fixed, non-null model matrix such that rank(X) = p, β is a p × 1 vector of unknown model parameters, and ε is an n × 1 vector of random perturbations such that E(ε) = 0_{n×1} and Var(ε) = V, where V is a known n × n non-null, symmetric nonnegative-definite (n.n.d.) matrix.
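As a numerical sketch (not taken from the paper), the model components above can be set up in NumPy together with the classical commutation condition HV = VH, where H is the orthogonal projector onto C(X), under which the least squares estimator of Xβ is known to be BLUE. The particular V built below is a hypothetical example chosen to satisfy that condition:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2
X = rng.standard_normal((n, p))          # n x p model matrix, rank p almost surely
H = X @ np.linalg.inv(X.T @ X) @ X.T     # orthogonal projector onto C(X)

# A covariance built from H and its complement commutes with H,
# so the LS and BLU estimators of Xβ coincide for this V.
V = 2.0 * H + 0.5 * (np.eye(n) - H)

print(np.allclose(H @ V, V @ H))  # True: the commutation condition holds
```

Any symmetric V that fails to commute with H (e.g., a generic positive-definite matrix) would make the two estimators differ, which is what the paper's characterization rules out.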
Concise derivation of necessary and sufficient conditions for the explicit characterization of the general nonnegative-definite covariance structure V of a general Gauss-Markov model with E(y) = Xβ and Var(y) = V such that the best linear unbiased estimator, the weighted least squares estimator, and the least squares estimator of Xβ are identical.
We present a concise proof of the explicit characterization of the general n.n.d. covariance structure V.
Summary
We derive a representation of the general nonnegative-definite covariance structure V defined above in terms of its Moore-Penrose pseudo-inverse. In this paper we give two characterizations of the general n.n.d. error covariance structure V in the Gauss-Markov model {y, Xβ, V} for which the BLU, WLS, and LS estimators of Xβ coincide, i.e., X̂β_BLU = X̂β_WLS = X̂β_LS, where y ∈ C(X : V). We define these covariance matrices to be BLU-WLS-LS matrices.
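A hedged numerical illustration of the kind of structure involved: covariances of the classical form V = XΓX′ + ZΔZ′, where the columns of Z span the orthogonal complement of C(X), are well known to make the WLS and LS estimators of Xβ agree. The Γ and Δ below are arbitrary positive-definite matrices chosen only for illustration, and need not match the paper's specific characterization:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 6, 2
X = rng.standard_normal((n, p))          # rank(X) = p almost surely

# Z: orthonormal basis of the orthogonal complement of C(X), via the SVD of X
U, _, _ = np.linalg.svd(X)
Z = U[:, p:]

# Illustrative covariance V = X Gamma X' + Z Delta Z' with p.d. Gamma, Delta
A = rng.standard_normal((p, p));         Gamma = A @ A.T + p * np.eye(p)
B = rng.standard_normal((n - p, n - p)); Delta = B @ B.T + (n - p) * np.eye(n - p)
V = X @ Gamma @ X.T + Z @ Delta @ Z.T

y = rng.standard_normal(n)
H = X @ np.linalg.inv(X.T @ X) @ X.T                    # LS projector onto C(X)
Vinv = np.linalg.inv(V)                                 # V is nonsingular here, so V^+ = V^{-1}
P_wls = X @ np.linalg.inv(X.T @ Vinv @ X) @ X.T @ Vinv  # WLS fit operator

print(np.allclose(H @ y, P_wls @ y))  # True: WLS and LS estimators of Xβ agree
```

Because Z′X = 0, one gets VX = XΓ(X′X), so C(VX) ⊆ C(X), which is the standard condition for the two estimators of Xβ to coincide; the check above verifies this numerically.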