In this paper, we prove the strong consistency of an estimator obtained by the truncated singular value decomposition for a multivariate errors-in-variables linear regression model with collinearity. This result extends Gleser's proof of the strong consistency of total least squares (TLS) solutions to the case with modern rank constraints. Whereas the usual treatment of consistency in the absence of a unique solution considers only the minimal-norm solution, the contribution of this study is a theory establishing the strong consistency of the entire set of solutions. The proof rests on properties of orthogonal projections, in particular properties of the Rayleigh-Ritz procedure for computing eigenvalues, which makes it well suited to problems in which some row vectors of the matrices contain no noise. Accordingly, this paper proves the result for the regression model under this condition on the row vectors, yielding a natural generalization of the strong consistency of the standard TLS estimator.
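
As background, the standard TLS estimator whose consistency is generalized here can be sketched as follows: the solution is read off from the right singular vectors of the augmented data matrix. This is a minimal illustration under the assumption of a full-rank, noise-free example (the function name `tls` and the block partition notation are ours, not the paper's; the paper's truncated-SVD estimator additionally imposes a rank constraint):

```python
import numpy as np

def tls(A, B):
    """Classical total least squares for A X ~ B via the SVD of [A | B].

    Partition the right singular vector matrix V of [A | B] as
        V = [[V11, V12],
             [V21, V22]],
    where the second block column corresponds to the smallest singular
    values. The TLS solution is X = -V12 V22^{-1}, provided V22 is
    nonsingular (the solvability condition).
    """
    n = A.shape[1]
    _, _, Vt = np.linalg.svd(np.hstack([A, B]))
    V = Vt.T
    V12 = V[:n, n:]   # top-right block
    V22 = V[n:, n:]   # bottom-right block, assumed nonsingular
    return -V12 @ np.linalg.inv(V22)

# Noise-free example: TLS recovers the true coefficient matrix exactly.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))
X_true = rng.standard_normal((3, 2))
B = A @ X_true
X_hat = tls(A, B)
print(np.allclose(X_hat, X_true))  # True
```

When the augmented matrix is (numerically) rank deficient, as under the collinearity considered in the paper, V22 may be singular and the solution set is no longer a single point, which is why the result concerns a set of solutions rather than only the minimal-norm one.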