The classical least-squares methods for (1) nonorthogonal polynomials (Gauss) and (2) orthogonal polynomials (Gram-Schmidt) are shown to share a fundamental underlying symmetry that unifies them under a single simple theory. The long-standing computational problem of generating orthogonal polynomials for irregularly spaced data is solved easily by extending the powerful, dimensionally invariant High Speed Matrix Generator (HSMG) theory. This theory is further shown to reduce to a simple equation relating the square of a Maclaurin series of any degree N to its linear counterpart, namely the same Maclaurin series of degree 2N. Within the unified theory, the Gauss process is shown to be (1) less computationally efficient than, and (2) a mere subset of, the Gram-Schmidt theory; hence the Gauss method is no longer considered useful for computing polynomial fits. This new least-squares theory of computational efficiency has extensive applications across science, engineering, physics, and mathematics. In fact, every linear least-squares (regression) problem can ultimately be transformed into a weighted Maclaurin series, so the given equation represents a complete unification of linear least squares in general.