The general Gauss–Markov model, Y = Xβ + e, E(e) = 0, \({Cov(e)=\sigma^2V}\), has been intensively studied and widely used. Most studies consider covariance matrices V that are nonsingular, but we focus on the most difficult case, wherein C(X), the column space of X, is not contained in C(V). This forces V to be singular. Under this condition there exist nontrivial linear functions Q′Xβ of the parameters that are known with probability 1 (perfectly), where \({C(Q)=C(V)^\perp}\). To treat \({C(X) \not \subset C(V)}\), much of the existing literature obtains estimates and tests by replacing V with a pseudo-covariance matrix T = V + XUX′ for some nonnegative definite U such that \({C(X) \subset C(T)}\); see Christensen (Plane Answers to Complex Questions: The Theory of Linear Models, 2002, Chap. 10). We find it more intuitive to first eliminate what is known about Xβ and then to adjust X while keeping V unchanged. We show that β can be decomposed into the sum of two orthogonal parts, β = β0 + β1, where β0 is known. We also show that the unknown component of Xβ is \({X\beta_1 \equiv \tilde{X} \gamma}\), where \({C(\tilde{X})=C(X)\cap C(V)}\). We replace the original model with \({Y-X\beta_0=\tilde{X}\gamma+e}\), E(e) = 0, \({Cov(e)=\sigma^2V}\), and perform estimation and tests under this new model, for which the simplifying assumption \({C(\tilde{X}) \subset C(V)}\) holds. This allows us to focus on the part of the parameters that is not known perfectly. We show that this method provides the usual estimates and tests.
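To make the two routes concrete, the following minimal numerical sketch (not part of the paper; the toy X, V, and β, the choice U = I, and the use of Moore–Penrose inverses are our own illustrative assumptions) builds a singular V with \({C(X) \not \subset C(V)}\), recovers the known part β0 from Q′Y, fits γ in the reduced model \({Y-X\beta_0=\tilde{X}\gamma+e}\), and checks that the resulting estimate of Xβ agrees with the pseudo-covariance estimate X(X′T⁻X)⁻X′T⁻Y with T = V + XUX′.

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(0)

# Toy instance of Y = X beta + e, Cov(e) = sigma^2 V, with V singular
# and C(X) not contained in C(V): the 4th coordinate of e is degenerate.
X = np.array([[1., 0.], [1., 1.], [1., 2.], [1., 3.]])
V = np.diag([1., 1., 1., 0.])                        # singular covariance
beta = np.array([2., -1.])
e = np.concatenate([rng.standard_normal(3), [0.]])   # e lies in C(V) a.s.
Y = X @ beta + e

# Q spans C(V)^perp (= null space of the symmetric V), so Q'e has zero
# mean and zero covariance, and Q'Y = Q'X beta holds with probability 1.
Q = null_space(V)

# Known part: beta0 is the minimum-norm solution of Q'X b = Q'Y.
beta0 = np.linalg.pinv(Q.T @ X) @ (Q.T @ Y)

# Xtilde spans C(X) ∩ C(V): the columns Xc with Q'Xc = 0.
Xtilde = X @ null_space(Q.T @ X)

# Reduced model Y - X beta0 = Xtilde gamma + e with C(Xtilde) ⊂ C(V);
# the BLUE of Xtilde gamma uses a generalized inverse of V (pinv here).
Ystar = Y - X @ beta0
Vp = np.linalg.pinv(V)
gamma = np.linalg.solve(Xtilde.T @ Vp @ Xtilde, Xtilde.T @ Vp @ Ystar)
fit_twostep = X @ beta0 + Xtilde @ gamma

# Pseudo-covariance route (Christensen 2002, Chap. 10): with U = I,
# T = V + X X' and the BLUE of X beta is X (X'T^-X)^- X'T^- Y.
T = V + X @ X.T
Tinv = np.linalg.inv(T)
fit_pseudo = X @ np.linalg.pinv(X.T @ Tinv @ X) @ (X.T @ Tinv @ Y)

print(np.allclose(fit_twostep, fit_pseudo))  # True: the two routes agree
```

In this sketch the fourth coordinate of Y is error-free, so the known part Xβ0 reproduces it exactly, while γ is estimated only on the directions of C(X) that meet C(V).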