Restricted Maximum Likelihood (REML) is nowadays the most widely recommended approach for fitting a Linear Mixed Model (LMM). Yet, like ML, REML has the drawback that it assumes normality for both the random effects and the residual errors, a dubious assumption for many real data sets. There have been several attempts to justify the use of the REML likelihood equations outside the Gaussian world, with varying degrees of success. Recently, a new fitting methodology, code-named 3S, was presented for LMMs under the sole added assumption (beyond the basic ones) that the residual errors are uncorrelated and homoscedastic. Specifically, the 3S-A1 variant was designed and then shown, for Gaussian LMMs, to differ only slightly from ML estimation. In this article, using the same 3S framework, we develop another iterative nonparametric estimation methodology, code-named 3S-A1.RE, for the kind of LMMs just described. We show, however, that if the LMM is indeed Gaussian with i.i.d. residual errors, then the set of estimating equations defining any 3S-A1.RE iterative procedure is equivalent to the set of REML equations, while additionally enforcing nonnegativity constraints on all variance estimates and positive semi-definiteness on all estimated covariance matrices. In numerical tests on simulated and real-world clustered and longitudinal data sets, our new methods proved highly competitive with traditional REML as implemented in the R statistical software.
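As a point of reference, the following is a minimal sketch of the traditional REML baseline mentioned above, assuming the comparison is run with the lme4 package (the abstract does not name the specific R package). It fits a random-intercept-and-slope LMM by REML on a longitudinal data set shipped with lme4; it is not an implementation of 3S-A1.RE.

```r
# Hypothetical REML baseline fit, assuming lme4 is the package used.
library(lme4)

# sleepstudy: reaction times measured over 10 days for 18 subjects
fit_reml <- lmer(Reaction ~ Days + (Days | Subject),
                 data = sleepstudy, REML = TRUE)

# Estimated fixed effects and variance components under REML
summary(fit_reml)
VarCorr(fit_reml)   # random-effect covariance matrix and residual SD
```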