Abstract

Regression methods are perhaps the most widely used statistical tools in data analysis. When several response variables are studied simultaneously, we are in the realm of multivariate regression. The usual formulation of the multivariate regression model, which relates a set of m responses to a set of n predictor variables, implicitly assumes that the m × n regression coefficient matrix is of full rank. It can then be shown that simultaneous estimation of the elements of the coefficient matrix, by least squares or maximum likelihood, yields the same results as a set of m multiple regressions in which each of the m response variables is regressed separately on the predictor variables. Hence, the fact that the responses are likely to be related plays no part in estimating the regression coefficients: no information about the correlations among the response variables is taken into account, and any knowledge gained by recognizing the multivariate nature of the problem is not incorporated when the parameters are estimated jointly. There are two practical concerns with this general multivariate regression model. First, accurate estimation of all the regression coefficients may require a relatively large number of observations, which can be a practical limitation.
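
To illustrate the equivalence described above, here is a minimal sketch in Python/NumPy (not from the paper; the data, dimensions, and variable names are hypothetical). It checks numerically that the joint least-squares fit of all m responses equals the m column-by-column multiple regressions; the coefficient matrix is written n × m, the transpose of the abstract's m × n convention.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: T observations, n predictors, m responses.
T, n, m = 200, 4, 3

X = rng.normal(size=(T, n))             # predictor matrix (T x n)
B_true = rng.normal(size=(n, m))        # true coefficients (n x m)
E = rng.normal(scale=0.5, size=(T, m))  # noise
Y = X @ B_true + E                      # response matrix (T x m)

# Joint (multivariate) least-squares fit: all m responses at once.
B_joint, *_ = np.linalg.lstsq(X, Y, rcond=None)

# m separate multiple regressions, one response column at a time.
B_separate = np.column_stack(
    [np.linalg.lstsq(X, Y[:, j], rcond=None)[0] for j in range(m)]
)

# Under the full-rank model the two estimates coincide: the joint fit
# makes no use of the correlations among the response variables.
assert np.allclose(B_joint, B_separate)
print("max abs difference:", np.abs(B_joint - B_separate).max())
```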
