Abstract

Regression analysis is at the center of almost every forecasting technique, yet few people are comfortable with the regression methodology. We hope to greatly improve that level of comfort with this article. We briefly discuss the theory behind the methodology and then outline a step-by-step procedure that allows almost anyone to construct a regression forecasting function for both the linear and the multivariate case. Linear regression is shown to be a special case of the multivariate problem. In addition to model formation and estimation, we discuss model testing (to establish the statistical significance of the factors) and the procedure by which the final regression equation is obtained from the estimated equation. The final regression equation is retained and used as the forecasting equation. A hand solution is derived for a relatively small sample problem, and this solution is compared to the MINITAB-derived solution to establish confidence in the statistical tool, which can then be used exclusively for larger problems.

Highlights

  • Multivariate regression analysis, in which an equation is derived that connects the value of one dependent variable (Y) to the values of p independent variables X1, X2, ..., Xp, starts with a given multivariate data set and uses the Least Squares method to assign the best possible values to the unknown multipliers found in the model we wish to estimate.

  • To estimate the multivariate model, we use the Least Squares methodology, which calls for the formation of a quadratic sum-of-squared-errors function that is then minimized. (Archives of Business Research (ABR))

  • Linear regression can be considered a special case of the more general multivariate regression model, which can be analyzed efficiently using matrix methods.
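The matrix-based Least Squares estimation described in the highlights can be sketched as follows. This is a minimal illustration using NumPy and made-up data (neither the data nor the code comes from the paper): the coefficient vector b solves the normal equations (X'X) b = X'Y, with a column of ones in X so that the model has a constant term.

```python
import numpy as np

# Hypothetical data set (not from the paper): five observations.
# The first column of X is all ones so the estimated equation
# Y = b1 + b2*X2 + b3*X3 has a constant term.
X = np.array([
    [1.0, 2.0, 5.0],
    [1.0, 3.0, 4.0],
    [1.0, 4.0, 6.0],
    [1.0, 5.0, 8.0],
    [1.0, 6.0, 7.0],
])
Y = np.array([10.0, 12.0, 15.0, 19.0, 20.0])

# Least Squares estimates from the normal equations (X'X) b = X'Y
b = np.linalg.solve(X.T @ X, X.T @ Y)
Y_hat = X @ b          # fitted values from the estimated equation
print(b)
```

Solving the normal equations directly mirrors the hand computation the paper walks through; a numerical library would typically use `np.linalg.lstsq` instead, which is more stable when X'X is nearly singular.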


Summary

INTRODUCTION

MODEL FORMATION AND MODEL ESTIMATION FOR THE MULTIVARIATE PROBLEM

Multivariate regression analysis, in which an equation is derived that connects the value of one dependent variable (Y) to the values of p independent variables X1, X2, ..., Xp, starts with a given multivariate data set and uses the Least Squares method to assign the best possible values to the unknown multipliers found in the model we wish to estimate. When the computed test statistic exceeds its critical value, H0 is rejected and we conclude that the entire regression (including the constant) is significant to the calculation of the Y value.

E) A Multivariate Example

The sales manager of a certain firm believes that sales ability depends on a salesman's verbal reasoning ability and vocational interest. He is interested in constructing a regression equation to use in future hiring to predict a candidate's success as a salesman, i.e. he wants to derive the regression equation. Note that the values of variable X1 are all equal to 1 because we want our regression equation to have a constant term. For both values of α, we reject H0 (H0: the entire equation, excluding the constant, is not significant) and conclude that the entire regression equation, excluding the constant, is significant.
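The significance test summarized above can be illustrated numerically. This is a hedged sketch with invented verbal-reasoning and vocational-interest scores (not the paper's data): it fits the regression by Least Squares and computes the F statistic for H0 that the equation, excluding the constant, is not significant; in practice the statistic is compared with a tabulated F(p-1, n-p) critical value for the chosen α.

```python
import numpy as np

# Hypothetical scores (not the paper's data): columns of X are the
# constant, verbal reasoning (X2), and vocational interest (X3);
# Y is the observed sales score.
X = np.array([
    [1.0, 28.0, 35.0],
    [1.0, 35.0, 40.0],
    [1.0, 32.0, 32.0],
    [1.0, 40.0, 45.0],
    [1.0, 25.0, 30.0],
    [1.0, 38.0, 42.0],
])
Y = np.array([22.0, 30.0, 26.0, 35.0, 20.0, 33.0])

n, p = X.shape                          # n observations, p coefficients
b = np.linalg.solve(X.T @ X, X.T @ Y)   # Least Squares estimates
Y_hat = X @ b

SSE = np.sum((Y - Y_hat) ** 2)          # error sum of squares
SSR = np.sum((Y_hat - Y.mean()) ** 2)   # regression sum of squares

# F statistic for H0: the equation (excluding the constant) is not
# significant; reject H0 when F_stat exceeds the tabulated
# F(p-1, n-p) critical value for the chosen significance level α.
F_stat = (SSR / (p - 1)) / (SSE / (n - p))
print(F_stat)
```

The critical value itself comes from an F table (or a statistics package such as MINITAB, as in the paper), so it is left out of this sketch.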

SUMMARY OF MULTIVARIATE PROCEDURE
Findings
CONCLUSIONS

