Abstract

Multiple regression analysis specifies a linear relation between a dependent variable and a set of independent variables. When the independent variables are non-stochastic and the error terms are homoscedastic and serially independent, ordinary least squares estimation of the parameters yields the best linear unbiased estimates. But when there is a set of linear regression equations whose error terms are contemporaneously correlated, ordinary least squares estimation of each equation separately is not the ‘best’ estimation procedure. When the parameters of contemporaneous correlation are known, it is possible to obtain unbiased estimates with smaller variance than the corresponding ordinary least squares estimates by estimating all the regression equations jointly using Aitken’s generalised least squares. In the absence of information on these parameters, Professor Zellner [5] suggested estimating them from the residuals of the ordinary least squares regressions. This procedure is called the ‘seemingly unrelated regression equations’ (SURE) procedure.

Keywords: Estimation Procedure; Linear Regression Equation; American Statistical Association; Unrelated Regression; Asymptotic Bias
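
As a rough illustration of the two-step procedure described above, the sketch below estimates each equation by ordinary least squares, uses the residuals to estimate the contemporaneous covariance matrix, and then applies generalised least squares to the stacked system. This is a minimal numpy sketch under our own assumptions (a common sample size across equations, no lagged dependent variables); the function name sure_fgls and the variable names are hypothetical and not taken from the paper.

import numpy as np

def sure_fgls(X_list, y_list):
    """Feasible GLS for seemingly unrelated regressions (a sketch).

    X_list : list of (T, k_i) design matrices, one per equation.
    y_list : list of (T,) response vectors, one per equation.
    Returns the stacked coefficient vector for all equations.
    """
    M = len(X_list)          # number of equations
    T = len(y_list[0])       # common sample size

    # Step 1: equation-by-equation OLS to obtain residuals.
    resid = np.empty((T, M))
    for i, (X, y) in enumerate(zip(X_list, y_list)):
        b_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid[:, i] = y - X @ b_ols

    # Step 2: estimate the contemporaneous covariance matrix from residuals.
    Sigma = resid.T @ resid / T

    # Step 3: GLS on the stacked system, with Omega = Sigma kron I_T.
    k_total = sum(X.shape[1] for X in X_list)
    X_block = np.zeros((M * T, k_total))
    col = 0
    for i, X in enumerate(X_list):
        X_block[i * T:(i + 1) * T, col:col + X.shape[1]] = X
        col += X.shape[1]
    y_stack = np.concatenate(y_list)

    Omega_inv = np.kron(np.linalg.inv(Sigma), np.eye(T))
    A = X_block.T @ Omega_inv @ X_block
    b = X_block.T @ Omega_inv @ y_stack
    return np.linalg.solve(A, b)

For example, with two equations the call would be sure_fgls([X1, X2], [y1, y2]); when the residuals across equations are uncorrelated, Sigma is (approximately) diagonal and the result collapses to the equation-by-equation OLS estimates.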
