Abstract

Given matrices $A$ and $B$ of the same order, $A$ is called a section of $B$ if $\mathscr{R}(A)\cap \mathscr{R}(B-A)=\{0\}$ and $\mathscr{R}(A^{T})\cap \mathscr{R}\big((B-A)^{T}\big)=\{0\}$, where $\mathscr{R}(\cdot)$ denotes the range (column space) of a matrix argument and the superscript $T$ stands for the transpose of a matrix. A matrix $G$ is called positive semidefinite (p.s.d.) if $x^{T}Gx$ is nonnegative for every real vector $x$. However, since the applications discussed in this article concern only symmetric p.s.d. matrices, we refer to a symmetric p.s.d. matrix simply as a p.s.d. matrix. An $n\times n$ p.s.d. matrix $G$ admits a symmetric section $G_{X}$ of $G$ with respect to an $n\times k$ matrix $X$ such that $\mathscr{R}(G_{X})=\mathscr{R}(G)\cap \mathscr{R}(X)$. In this article, sections of the type $G_{X}$ are used in the minimization of quadratic functions under linear constraints and in the splitting of vector random variables into uncorrelated vector random variables. In the general Gauss-Markoff model, $y=X\beta +\varepsilon$, with design matrix $X$ and a singular covariance matrix $\sigma^{2}G$ of $\varepsilon$, $y$ is decomposed into four uncorrelated vector random variables as $y=M_{1}y+M_{2}y+M_{3}y+M_{4}y$, where $M_{i}$, $i=1,2,3$, consist of sections of $G$ and $XX^{T}$, and $M_{4}$ is a matrix whose row space is the null space of $G+XX^{T}$.
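As a minimal numerical sketch of the section condition (assuming NumPy; the helper name `is_section` and the test matrices are hypothetical illustrations, not from the paper), the trivial-intersection requirement can be checked with the rank identity $\dim(\mathscr{R}(M)\cap \mathscr{R}(N)) = \operatorname{rank}(M) + \operatorname{rank}(N) - \operatorname{rank}([M\ N])$:

```python
import numpy as np

def is_section(A: np.ndarray, B: np.ndarray, tol: float = 1e-10) -> bool:
    """Check whether A is a section of B, i.e. whether
    R(A) ∩ R(B-A) = {0} and R(A^T) ∩ R((B-A)^T) = {0}."""
    C = B - A

    def trivial_intersection(M, N):
        # dim(R(M) ∩ R(N)) = rank(M) + rank(N) - rank([M N]),
        # so the intersection is {0} iff the stacked rank is the sum.
        return np.linalg.matrix_rank(np.hstack([M, N]), tol=tol) == (
            np.linalg.matrix_rank(M, tol=tol) + np.linalg.matrix_rank(N, tol=tol)
        )

    return trivial_intersection(A, C) and trivial_intersection(A.T, C.T)

# Hypothetical example: A carries the first coordinate, B - A the second,
# so both range conditions hold and A is a section of B.
A = np.diag([1.0, 0.0])
B = np.diag([1.0, 2.0])
print(is_section(A, B))  # True
```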